Display MPLD3 graph offline - python

I have a couple of graphs that need to be shown on a local website, they're made with MPLD3, and I used the save_html option. However, I've just been told that the graphs need to be able to be viewed offline, so I wanted to know if there was a way to do this without mpld3.show(), because I need the graphs embedded in the website.

If possible, please elaborate on what you mean by "local website". It sounds like you have an index.html file on your hard drive that you're rendering in a browser.
If that's the case and you want this to work with no internet connection, then you'll likely have to embed the D3 and mpld3 JavaScript dependencies into the HTML file after you save it to a file. I think the default behavior is to retrieve those libraries from a CDN rather than embedding them in full.
Another option would be to use the fig_to_html() kwargs d3_url= and mpld3_url= to set the paths to your locally stored D3 and mpld3 libraries, using a "file://" prefix rather than "https://" (again, this just avoids loading the dependencies via a CDN).

Related

Configure Anti-Aliasing parameter in ArcGIS Server Service in python

I need to set the anti-aliasing parameters for map services via Python, but I'm not having much luck finding out how to configure this.
I have tried looking into the .sddraft file as mentioned in the documentation: https://desktop.arcgis.com/en/arcmap/10.6/analyze/arcpy-mapping/createmapsddraft.htm
However, when I open the .sddraft file as XML, I cannot find the textAntiAliasingMode or AntiAliasingMode parameter in the file, so I cannot use xml.dom.minidom to update it.
Any ideas??
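For reference, if the draft did contain the key, the xml.dom.minidom edit mentioned above would look roughly like this. The Key/Value layout and the antialiasingMode key name follow Esri's published CreateMapSDDraft examples, but both are assumptions to verify against your own file (a tiny inline sample stands in for the real draft):

```python
import xml.dom.minidom as DOM

# Minimal stand-in for a .sddraft; real files are far larger.
# The key name "antialiasingMode" is an assumption from Esri's examples.
sample = """<SVCManifest>
  <PropertySetProperty>
    <Key>antialiasingMode</Key>
    <Value>None</Value>
  </PropertySetProperty>
</SVCManifest>"""

doc = DOM.parseString(sample)
for prop in doc.getElementsByTagName("PropertySetProperty"):
    key = prop.getElementsByTagName("Key")[0].firstChild.data
    if key == "antialiasingMode":
        # Valid values per Esri docs include None, Fastest, Fast, Normal, Best.
        prop.getElementsByTagName("Value")[0].firstChild.data = "Fast"

updated = doc.toxml()
```

If the property is genuinely absent from your draft, it may need to be appended as a new PropertySetProperty element rather than edited in place.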

How can another user run my Python code and access its Dash outcome without sharing source code

I use Google Colab in which I have defined a (Python 3) function.
When the function is called with several selected parameters, it takes source files (CSV) from Google Drive and executes the code. The output of the function is a couple of figures that I visualize using Dash on my local computer.
I want other users, too, to be able to remotely run the source code (ideally placed on Google drive still), define the parameters in a text box, click OK button, and see the dashboard with figures on their computers.
All that without any installation and without sharing the source code: just click a link, define parameters, confirm them, run my Google Colab notebook, and see a dashboard. Is this even technically possible, and if so, how?
I am aware of solutions that involve sharing the source code, creating web pages or apps, etc. I would prefer to avoid any of that.
Thanks a lot for your inputs.

Is there a way to automatically download data from a utility using python?

I'm trying to download files automatically from a utility, like PG&E, so I can view my statements easily, since I have a couple of different accounts for one login. Currently, I'm using Python to figure it out, but I'm stuck on how to actually write the code to download the files. As of right now, I can access all of the statement information within Python, but cannot figure out how to download the actual bills themselves. Does anyone have additional information or resources you can share so I can try downloading bills using Python?
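Without knowing the utility's site, only a generic sketch is possible. Assuming you already have an authenticated requests session from your login flow, and that each bill exposes a direct download URL (both assumptions; the URL below is hypothetical, and your browser's network inspector will show the real one when you click a bill's download link), a helper like this streams each statement to disk:

```python
import requests

def download_statement(session, url, dest_path):
    """Stream one statement file to disk.

    `session` should already be authenticated (i.e. carry the login
    cookies); `url` is the bill's direct download link.
    """
    resp = session.get(url, stream=True)
    resp.raise_for_status()  # fail loudly on a login redirect / 403
    with open(dest_path, "wb") as f:
        for chunk in resp.iter_content(chunk_size=8192):
            f.write(chunk)
    return dest_path

# Usage sketch -- every URL here is hypothetical:
# session = requests.Session()
# session.post("https://example-utility.com/login", data={"user": "...", "pw": "..."})
# download_statement(session,
#                    "https://example-utility.com/bills/123.pdf",
#                    "bill_123.pdf")
```

If the site is rendered entirely by JavaScript and there is no direct file URL, a browser-automation tool such as Selenium may be needed instead.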

How to give bookmark links in Jupyter Notebook that work on GitHub? [duplicate]

So there is a lot out there about creating anchors in markdown, and creating internal table-of-contents-type anchors in a notebook. What I need though is the ability to access an anchor in my notebook on Github from an external source, e.g.:
https://github.com/.../mynotebook.ipynb#thiscell
I've got a number of interactive tutorials hosted this way, and a single manual that I want to be able to link to sections of the notebooks for. I can add the anchor tags into markdown cells just fine, using:
<a id='thiscell'></a>
but when I try using the link as I wrote above, it just loads the notebook at the top, as if there was no reference to an anchor.
GitHub renders notebooks using a separate domain, render.githubusercontent.com, and integrates the output in a nested frame. This means that any anchors on the GitHub URL won't work, because the framed document is a different URL entirely.
Moreover, the framed content is not easily re-usable, as the result is a cached rendering of the notebook with a limited lifetime. You can't rely on it sticking around for later linking!
So if you need to be able to link to sections in a notebook, you'd be far better off using the Jupyter notebook viewer service, https://nbviewer.jupyter.org/. It supports showing notebooks from any public URL including GitHub-hosted repositories and GitHub gists. You can also just enter your GitHub user name (or username/repository) for quick access.
This notebook viewer is far more feature-rich than the one GitHub uses. GitHub kills all embedded JavaScript, and strips almost all HTML attributes. Any embedded animations are right out. But the Jupyter nbviewer service supports those directly out of the box.
E.g. compare these two notebooks on nbviewer:
https://nbviewer.jupyter.org/github/mjpieters/adventofcode/blob/master/2018/Day%2020.ipynb
https://nbviewer.jupyter.org/github/mjpieters/adventofcode/blob/master/2018/Day%2021.ipynb
with the same notebooks on GitHub:
https://github.com/mjpieters/adventofcode/blob/master/2018/Day%2020.ipynb
https://github.com/mjpieters/adventofcode/blob/master/2018/Day%2021.ipynb
The first one contains an animation at the end, the second has a complicated table made easier to read by use of some HTML styling and anchor links.
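Since the nbviewer URLs above are just the GitHub URLs with the host portion swapped, a small helper can rewrite a manual's links in bulk. A sketch (anchors such as #thiscell pass through unchanged):

```python
def to_nbviewer_url(github_url):
    """Rewrite a github.com notebook URL (optionally carrying a #anchor)
    to the equivalent nbviewer URL, where in-notebook anchors resolve."""
    return github_url.replace(
        "https://github.com/", "https://nbviewer.jupyter.org/github/", 1
    )

print(to_nbviewer_url(
    "https://github.com/mjpieters/adventofcode/blob/master/2018/Day%2020.ipynb#thiscell"
))
```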
I had the same problem. As a workaround, I have delegated the rendering of my notebook to http://nbviewer.jupyter.org. It's just a matter of providing the notebook's public GitHub URL and clicking Go!
Of course, the internal links still don't work on GitHub, but I now have a functioning notebook somewhere on the web, which is what I actually wanted in the first place.
I hope this applies to your case too.

Bokeh Server Files: load_from_config = False

on Bokeh 0.7.1
I've noticed that when I run the bokeh-server, files appear in the directory that look like bokeh.data, bokeh.server, and bokeh.sets, if I use the default backend, or redis.db if I'm using redis. I'd like to run my server from a clean start each time, because I've found that if the files exist, over time, my performance can be severely impacted.
While looking through the API, I found the option to turn "load_from_config" from True to False. However, tinkering around with this didn't seem to resolve the situation (it seems to only control log-in information, on 0.7.1?). Is there a good way to resolve this and eliminate the need for me to manually remove these files each time? What is the advantage of having these files in the first place?
These files are to store data and plots persistently on a bokeh-server, for instance if you want to publish a plot so that it will always be available. If you are just using the server locally and always want a "clean slate" you can run with --backend=memory to use the in-memory data store for Bokeh objects.
