IPython notebook

I know IPython consoles are available, but is it possible to have access to an IPython notebook on PA? Thanks

Not yet, but we're working on it.

Awesome. Will there be a blog post about it once it is done? How close is it to being ready?

There will definitely be a blog post when it's done, but we don't know when that will be.

ok thanks

+1!!!

+1 here too!

:-)

Like Glenn said, we're actively working on it right now. It definitely won't be in the next release (which should be out in the next week or so). But it'll go into an internal beta (e.g. only PythonAnywhere developers will see it) either the release after next, or the one after that. Maybe a month or so from now, but that's a guess rather than a commitment...

If it all looks OK after the internal beta, and we decide we don't need to change/fix anything before moving to an external beta, we'll email everyone who has expressed an interest and invite them to try it out -- it's something we can enable/disable on a per-user basis.

Perhaps it's worth saying roughly how it is likely to look -- again, this is not definite, it's just what we have done so far. There will be new options on the "Consoles" tab, so as well as being able to start a Python console, an IPython console, a Bash console and so on, you'll also be able to start an IPython Notebook with Python 2.7, 3.3 or 3.4. When you click on those links, you'll be taken to the Notebook. From there you can do pretty much any normal IPython Notebook-y things, including saving it to your private file storage. (It's likely to be saved as something like "Untitled1.ipynb" in your home directory by default.) They run in the same context as your normal consoles, tasks and web apps and have access to all of the same stuff.

Additionally, because IPython notebooks are files, they'll show up in the "Files" tab. If you click on them there, they will start up.

Does that sound good?

Sounds great. Thanks!

Spot on! Can't wait for it!

Dig it! I want sincerely to "express interest" and be in on the beta!

@giles: sounds good to me. Will I be able to specify a virtual environment to run the notebook under? For example, tmpnb.org gives you pandas and matplotlib, but what if I want Bokeh or Seaborn?

thanks

We're not sure about the virtualenv thing yet, but what we do know is that packages installed into your user environment will be available to your notebooks.

+1 for notebooks.

Looking forward to it! Cheers!

Hi there, we now have an early beta of the IPython notebooks feature which we're ready to start inviting people to. If you're on this thread, you should find you now have the feature.

We're still working on the UI, but if you go to the "Files" tab, you'll find the option to start new notebooks. You'll also be able to view any existing notebooks you have or upload new ones.

Let us know what you think!

First of all: I really appreciate you working on requested features like this one!

An idling notebook seems to put its user into the tarpit pretty quickly. Any ideas whether that's inherent to a notebook?

Cheers, Tom

I think notebooks are just pretty CPU-intensive -- they are doing an awful lot of clever stuff in the background! But, you know, paid accounts get you lots more CPU quota, and they start from just $5 per month, just sayin'....

Do you find that being in the tarpit makes the notebooks totally unusable? Or just slow?

Thanks for reminding me about the paid accounts. ;-) Had one for several months and might come back one day. Just a shift of priorities, currently.

Still, I'm wondering whether a local notebook causes the same amount of CPU load even without having the notebook open in a browser.

Being in the tarpit does not have a significant impact on the notebook's performance (e.g. plt.plot(np.random.randint(10, 10000, 100000)) takes about a second to plot).

The notebooks I've started all seem to remain in the process list. Am I closing them the wrong way?

When you close the tab, the notebook kernel keeps running, ready for you to re-open it later. It hadn't occurred to us that people might want to "close" them, and shut down all processes... Are you worried about CPU usage? It shouldn't use much just ticking over... The main usage should only take place when you recalculate cells, and even then only ones with big calculations in...
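The distinction above -- wall-clock idle time versus actual CPU time -- can be sketched with a quick stdlib-only experiment. This is just an illustration of the general principle, not how PythonAnywhere actually meters quota:

```python
import resource
import time

def cpu_seconds():
    # Combined user + system CPU time consumed by this process so far
    usage = resource.getrusage(resource.RUSAGE_SELF)
    return usage.ru_utime + usage.ru_stime

start = cpu_seconds()
time.sleep(0.5)  # idle: half a second of wall-clock time, almost no CPU burned
idle_cost = cpu_seconds() - start

start = cpu_seconds()
total = sum(i * i for i in range(2_000_000))  # an actual computation
busy_cost = cpu_seconds() - start

print(f"idle: {idle_cost:.3f}s CPU, busy: {busy_cost:.3f}s CPU")
```

An idle kernel sitting between cell executions is in the `time.sleep` situation: the clock moves on, but very little CPU allowance is consumed.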

No problem on my side if 30-something notebooks are in the list, but they should not drain CPU.

I am posting this here because I see there has already been discussion of idling notebooks putting unpaid accounts in the tarpit quickly...

This has been happening to me over the last few days too, so about 24 hours ago I deleted the single-celled test IPython notebook I had set up. However, this morning I was shocked to see the message that I had again been put in the tarpit ("152.33s of your 100 second CPU allowance").

I see now on the Consoles tab that I can look at 'Running processes', and sure enough there are still two that relate to the now-deleted notebook, I believe. I am fairly sure, since the only other consoles I have are Bash. Here is the first part of each of the two processes:

/usr/local/bin/python2.7 -m IPython.kernel -f ...
/usr/local/bin/python3.4 -m jupyterhub.singleuser --user=...

I am going to press the red kill button to the right of each and hope that, once my allowance resets, I will no longer be moved to the tarpit pretty much right away. I am posting this in case someone can enlighten me that that is not the case, or in case it helps anyone else scratching their head after deleting an IPython/Jupyter notebook.

A related question concerns how much I went over. The Running processes list says they used 4.39 CPU and 9.68 CPU seconds, respectively. I don't see how that adds up to "152.33s of your 100 second CPU allowance". Fuzzy math? Does it only count up to when they were put in the tarpit?

Just another comment related to the IPython notebooks: it seems that the notebook consoles are registering in the access logs of my web apps (I have two, and they seem to be registered in both). I don't know if this will have an impact on access to the web apps.

@wayne461 yes -- a server is spun up to serve the notebooks associated with your account, which is why it's resource hungry. That's also why deleting one IPython notebook won't stop the notebook processes automatically: there are a bunch of moving parts (e.g. other notebooks that you haven't deleted might still be running things). For now, you'll have to go into the process list to kill them. We're looking into options such as adding a button to kill/stop the server.

For the CPU accounting stuff: processes that have been killed or finished running don't show up in the running processes list, so that probably explains why you weren't able to get the seconds to add up.

@lint78 thanks for the message. We will look into it. At first glance, there shouldn't be an impact on the web apps -- I think the log-parsing machinery we set up to separate out the logs for you just needs tweaking to work with IPython notebooks.

Thanks for the clarification, Conrad. I'll see what happens at the next reset.

I still don't quite follow the CPU accounting, since at that point I had not yet killed the processes, but I was being told I had used "152.33s of your 100 second CPU allowance". I had no other processes running in my account besides those in the active allowance cycle.

Conrad,

I am sorry, but what I said was not correct: it doesn't register in the access logs. I got confused because I was embedding one of my web apps in an iframe in a local notebook, and that was what was being registered in the access log. I think there is no confusion happening with PythonAnywhere IPython notebook logs. Sorry about that. Luis

@wayne461, re: the cpu seconds not adding up, I wonder if what's happening is that the notebook processes are occasionally restarting, killing old ones and starting new ones? that would explain how it managed to use 150s, but the only processes alive at a given point in time were using 5s.
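That restart theory can be illustrated with a toy accounting sketch. Only the 4.39s and 9.68s figures come from the thread; the dead kernels' numbers are hypothetical, chosen so the total matches the 152.33s reported:

```python
# Hypothetical history of notebook processes over one quota cycle.
# Dead (restarted) kernels no longer appear in the process list, but
# the CPU they burned still counts against the allowance.
history = [
    {"pid": 4101, "cpu": 70.48, "alive": False},  # restarted kernel (made up)
    {"pid": 4102, "cpu": 67.78, "alive": False},  # restarted kernel (made up)
    {"pid": 4103, "cpu": 4.39,  "alive": True},   # visible in process list
    {"pid": 4104, "cpu": 9.68,  "alive": True},   # visible in process list
]

charged = sum(p["cpu"] for p in history)                # what the quota sees
visible = sum(p["cpu"] for p in history if p["alive"])  # what the list shows

print(f"charged: {charged:.2f}s, visible: {visible:.2f}s")
```

So a 152.33s charge is perfectly consistent with only ~14s of CPU showing against the processes still alive at the time you look.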

In any case, we'll look into a way of either killing those processes automatically, or giving people an easy option to do it manually...

Hi, I wondered if the option to stop an IPython kernel has been implemented yet? When running the notebook on your local PC you can stop kernels from the Jupyter dashboard, but this dashboard doesn't exist on PA. I was wondering why I had used 25% of my daily CPU without doing anything, and then noticed the IPython kernels still running, which I guess explains things? Anyway, I'll kill them from the Consoles page, which is easy enough to do...

Just a thought: to run a notebook on one of my remote (non-PA) servers, I execute "jupyter notebook --no-browser" from the terminal, which makes the notebook run on port 8888. I then connect to it from my local PC using an SSH tunnel, assigning it to a port on localhost. This gives me full notebook capabilities, and I can stop kernels from the dashboard. Is a similar setup possible with PA?

@stephenl6705, re whether it's possible to run the notebook via a tunnel: there's only one way to find out! One possible hiccup is that somebody else might have grabbed port 8888 on that server, so you may need to figure out how to tell Jupyter to use a different port...

I tried it, but I get the following error:

channel 2: open failed: connect failed: Connection refused

I have no issue trying to SSH into my account.

No need to investigate though on my account. The connection to the notebook on PA is already secure and it's only a click away. I have no issue with killing the processes. I was just curious to see if I could build the connection differently.

There seems to be something wrong with the notebooks at the moment. Every notebook I tried led to a dying server, even empty/new ones.

Everything looks OK at the moment, and I can't see anything in the logs. Are they still not working for you? Are you trying them from a different location, perhaps? They use non-HTTP/S ports to connect back to our servers, so if you're trying them from somewhere behind a proxy then they might not work.

Just found out that only the 2.7 kernels die. So, within the same empty notebook: switching to 3.0, all OK; switching to 2.7, the kernel is dead within a second. Quite reproducible.

That's really weird. If 3.x works then it can't be a network thing. I've confirmed that I can start a 2.7 notebook and run some simple code.

Aha! I've just spotted something interesting in the logs.

/usr/local/bin/python2.7: cannot import name Path

It doesn't say which session it was associated with, but timing-wise it looks like it's associated with you starting a 2.7 kernel.

Have you got any special configuration set up for IPython generally?

Not knowingly. Checking *.ipynb file dates, I'd say I've done nothing since the last time they worked, ~2016-03-15.

Could you check your .ipython directory to see if you have any references to it? This, run from your home directory in a Bash console, should do the trick:

grep -Ir Path .ipython

Yields no output.

From this, one potential cause might be that you have a different version of the path package. Did you happen to install any packages recently?

Otherwise, it may be that the path thing is a red herring.
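One generic way to check which on-disk module is shadowing what the kernel expects is to print where each candidate actually loads from. This is a diagnostic sketch, assuming you run it with the same interpreter the failing kernel uses (e.g. `python2.7` here):

```python
# Print where each candidate module is loaded from, so a forked or
# shadowing package shows up immediately by its unexpected file path.
import importlib

for modname in ("path", "pathlib", "IPython"):
    try:
        mod = importlib.import_module(modname)
        print(modname, "->", getattr(mod, "__file__", "<built-in>"))
    except ImportError:
        print(modname, "-> not installed")
```

A forked `path` package installed into the user environment would show up here with a path under `~/.local/`, which is the kind of thing that can break an `import` inside IPython's own startup.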

Seems solved. There was a forked path package installed; uninstalling it did help.

THX!

Excellent :-) Thanks for confirming!

Is there a way to give a non-PA user access to a jupyter notebook that I create?

Not within PythonAnywhere, unfortunately -- there are some nasty security issues with sharing notebooks. But you could just email the .ipynb file to them?

fizixgeek, if you want to avoid your user having to install anything to work with Jupyter notebooks on their machine, and your notebook doesn't need to be absolutely secure, you can either e-mail it to them and have them upload it to tmpnb.org, or put it on GitHub and point mybinder.org at it to make an active notebook. I have used the latter a good deal. You can see good examples of the binder approach here and here. Of course, the repo can be set up more simply too. You can supposedly use private repos to build a binder as well, but I have not yet tried that.

And if you don't need it active, you can just put it on Github and it will be rendered by nbconvert automatically now.

Count me as a paid user losing about 25% of daily CPU time to idle IPython notebooks. Is killing them via the Consoles page the only way to prevent this, or can they be shut down gracefully? Will a "killed" notebook fire up just fine when I need to use it again? Thank you.

Hi @andy800, killing the IPython processes from the Consoles page is currently the only way to switch them off, but we've never seen any problems with notebooks failing to fire up again after you kill them, so I wouldn't worry about it...

I started a notebook yesterday and it is consuming my CPU allocation. I cannot seem to get the 'Running processes' list to load under the Consoles tab; it just hangs on 'Loading processes...'. (I wonder if it may be because the image in my account is very old?) I'd appreciate it if you could kill my notebook processes for me, since I cannot see the list. Thanks.

We had a little trouble yesterday that may have prevented the process list from working. It is working again now.