Really? That would be pretty awesome, but also quite brave! (^_^)
I can't help wondering if there's a way to abstract it out so that instead of getting access to a real compiler (and all that CPU and IO load), people could have a web-based means of requesting PyPI modules. The requests would be serialised into a single queue and processed in order, with the results either installed into the general environment for everyone, or perhaps left in the user's own space but with the build artifacts cached so the next user to request the same module gets it instantly. The latter approach would be more complicated, but would allow people to request specific versions for themselves.
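To make the queue idea a bit less vague, here's a minimal sketch of what I mean — entirely hypothetical, the `BuildCache` class and the `build` callable are just my invention, and a real version would obviously need per-user install logic rather than an in-memory dict:

```python
import queue
import threading


class BuildCache:
    """Hypothetical sketch: serialise build requests into a single queue so
    each (package, version) pair is only ever compiled once, with the
    resulting artifact cached for every later requester."""

    def __init__(self, build):
        self.build = build             # callable that does the real compile
        self.cache = {}                # (package, version) -> artifact
        self.requests = queue.Queue()  # single queue serialises all builds
        threading.Thread(target=self._worker, daemon=True).start()

    def _worker(self):
        while True:
            key, done = self.requests.get()
            if key not in self.cache:       # first requester pays the cost
                self.cache[key] = self.build(*key)
            done.set()                      # later requesters get it instantly
            self.requests.task_done()

    def request(self, package, version):
        """Queue a build request and block until the artifact is available."""
        done = threading.Event()
        self.requests.put(((package, version), done))
        done.wait()
        return self.cache[(package, version)]
```

The nice property is that two users asking for the same version never trigger two compiles — the second request just waits its turn in the queue and then hits the cache.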
Maybe it's just a crazy idea and it's definitely a little vague, but it would avoid lots of people having to spend the time (and available resources) compiling up the same modules. Almost certainly more complicated to implement, of course, which is a fairly significant downside.
Of course, you could alternatively just compile every released version of every package on PyPI and make them available in binary form... Hm, I can't decide if I'm entirely joking when I say that! (o_O)
EDIT: I suppose in hindsight having access to a compiler isn't actually any worse than being able to execute arbitrary code anyway, and getting that working is probably the easiest and most flexible option. I still like the idea of a package cache for speed and ease, but as I say it's probably quite fiddly to implement - maybe create a system protected by a writable unionfs layer on top, and then simply recover every file created during the installation process and tar them all up. Or something. One to file under "if only time were infinite...", perhaps!
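For what it's worth, the "recover every file the install created" part doesn't strictly need unionfs — a crude stand-in is a before/after snapshot of the tree. A rough sketch of that simpler variant (hypothetical helper, not how any real service does it; a real unionfs upper layer would also catch modified and deleted files, which this doesn't):

```python
import os
import tarfile
import tempfile


def capture_new_files(root, install):
    """Hypothetical sketch: run `install` (the installation step), spot every
    file it created under `root` via a before/after snapshot, and tar those
    files up for the cache."""

    def snapshot():
        return {os.path.join(d, f)
                for d, _, files in os.walk(root) for f in files}

    before = snapshot()
    install()                                  # run the installation step
    created = sorted(snapshot() - before)      # files that didn't exist before

    # Archive only the newly created files, with paths relative to `root`.
    archive = tempfile.NamedTemporaryFile(suffix=".tar.gz", delete=False)
    with tarfile.open(archive.name, "w:gz") as tar:
        for path in created:
            tar.add(path, arcname=os.path.relpath(path, root))
    return archive.name, created
```

Unpacking that tarball into the next user's space would then be the "gets it instantly" part.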