
Data Visualisation with Python and JavaScript: Crafting a dataviz toolchain for the web

This repo contains the code to accompany the O'Reilly book Data Visualisation with Python and JavaScript. It's currently being refined, prior to the book's release in early July 2016.

Installing Dependencies

The instructions in Chapter 1 Development Setup should provide you with a basic Anaconda setup, providing the main Python data analysis and visualisation tools. I recommend using a virtual environment, either using Anaconda's conda command:

$ conda create --name pyjsviz anaconda

or using virtualenv:

$ virtualenv pyjsviz

With the virtual environment activated, any extra dependencies can be installed from `requirements.txt` with pip:

$ pip install -r requirements.txt

You should now have all the Python libraries you need.
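A quick way to sanity-check the installation is to probe for the main libraries without importing them. This is a sketch, not part of the book's code; the module names are the usual Anaconda staples, assumed rather than taken from an exhaustive list:

```python
from importlib import util

def missing_libs(names=("numpy", "pandas", "matplotlib", "flask")):
    """Return the subset of required libraries that cannot be found."""
    # find_spec locates a module without importing it; None means missing
    return [n for n in names if util.find_spec(n) is None]

if __name__ == "__main__":
    gaps = missing_libs()
    print("All present" if not gaps else "Missing: %s" % ", ".join(gaps))
```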

Seeding the MongoDB Nobel-prize database

In order to seed the database with the Nobel-prize winners dataset, use `run.py`:

$ python run.py seed_db
Seeded the database with 858 Nobel winners

You can drop the database like so:

$ python run.py drop_db
Dropped the nobel_prize database from MongoDB
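Under the hood, the seed and drop commands presumably boil down to a few pymongo calls against the `nobel_prize` database. A minimal sketch (function names and the dataset path are illustrative assumptions, not the book's actual code):

```python
import json

def load_winners(path="data/nwinners.json"):
    """Load the Nobel winners dataset from a JSON file (path is an assumption)."""
    with open(path) as f:
        return json.load(f)

def seed_db(winners, db_name="nobel_prize"):
    """Insert the winners dataset into a MongoDB database."""
    from pymongo import MongoClient  # lazy import: only needed when seeding
    db = MongoClient()[db_name]
    db.winners.insert_many(winners)
    return len(winners)

def drop_db(db_name="nobel_prize"):
    """Remove the whole database, mirroring `python run.py drop_db`."""
    from pymongo import MongoClient
    MongoClient().drop_database(db_name)
```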

Using the Jupyter (IPython) Notebooks

There are notebooks to accompany chapters 9, 10 and 11. To use them just run Jupyter (or IPython for older versions) from the command-line in the root directory:

$ jupyter notebook
...
[I 20:50:56.397 NotebookApp] The IPython Notebook is running at: http://localhost:8888/
[I 20:50:56.397 NotebookApp] Use Control-C to stop this server and shut down all kernels (twice to skip confirmation).

You should now be able to select the notebooks from your default web-browser and use them to follow their respective chapters.

Note - D3 version 4

Since the book was published D3 has shifted versions, necessitating some conversion work from existing v3 visualizations. One key change is a flattening of the namespace, e.g.:

  • d3.scale.linear ↦ d3.scaleLinear
  • d3.geo.path ↦ d3.geoPath

This is an easy adaptation. A harder one is the change in the way the `enter` method works, which affects the existing examples of the all-important update pattern.

A version 4 compliant Nobel Visualization can be found in the `nobel_viz_D3_v4` directory. The following code snippets from the Nobel bar chart show the changes made to the update pattern and the use of the new `merge` method. Because the `enter` method no longer magically updates its selection, we need to merge the original selection back into the newly appended elements (created by the data join) in order to change their attributes in one go.

D3 Version 3:

```js
var bars = svg.selectAll(".bar")
    .data(data, function(d) { return d.code; });

bars.enter().append("rect")
    .attr("class", "bar")
    .attr("x", xPaddingLeft);

bars
    .classed('active', function(d) {
        return d.key === nbviz.activeCountry;
    })
    .transition().duration(nbviz.TRANS_DURATION)
    .attr("x", function(d) { return xScale(d.code); })
    .attr("width", xScale.rangeBand())
    .attr("y", function(d) { return yScale(d.value); })
    .attr("height", function(d) { return height - yScale(d.value); });

bars.exit().remove();
```

D3 Version 4:

```js
var bars = svg.selectAll(".bar")
    .data(data, function(d) {
        return d.code;
    });

bars.enter().append("rect")
    .attr("class", "bar")
    .attr("x", xPaddingLeft)
  .merge(bars)
    .classed('active', function(d) {
        return d.key === nbviz.activeCountry;
    })
    .transition().duration(nbviz.TRANS_DURATION)
    .attr("x", function(d) { return xScale(d.code); })
    .attr("width", xScale.bandwidth())
    .attr("y", function(d) { return yScale(d.value); })
    .attr("height", function(d) { return height - yScale(d.value); });

bars.exit().remove();
```

The Nobel Visualization

The Python and JavaScript files for the Nobel Visualization are in the nobel_viz subdirectory. These include the config, login and test files demonstrated in the book's appendix:

```
nobel_viz
├── api                 <-- EVE RESTful API
│   ├── server_eve.py   <-- EVE server
│   └── settings.py
├── config.py
├── index.html          <-- entry index.html file for static Nobel-viz
├── __init__.py
├── nobel_viz.py        <-- Nobel-viz server
├── nobel_viz.wsgi
├── SpecRunner.html
├── static
│   ├── css
│   ├── data
│   ├── images
│   ├── js
│   └── lib
├── templates
│   ├── index.html      <-- template for entry html file for dynamic Nobel-viz
│   ├── login.html
│   └── testj2.html
├── test_nbviz.py
├── tests
│   ├── jasmine
│   └── NbvizSpec.js
├── tests.js
└── tests.pyc
```

Running the Nobel-viz

You can run the Nobel-viz in two ways: one uses static files to emulate an API and can be run without MongoDB; the other uses the EVE RESTful API with the Nobel winners dataset seeded by `run.py`.

Running it statically

To run the Nobel-viz statically, just run Python's `SimpleHTTPServer` module from the `nobel_viz` directory (on Python 3 the module is called `http.server`):

nobel_viz $ python -m SimpleHTTPServer
Serving HTTP on 0.0.0.0 port 8000 ...

If you go to `http://localhost:8000` with your web browser of choice, you should see the Nobel-viz running.

Running it dynamically with the EVE-API

To run the Nobel-viz using the EVE RESTful API, first start the EVE server by running it from the `nobel_viz/api` subdirectory:

nobel_viz/api $ python server_eve.py
 * Running on http://127.0.0.1:5000/ (Press CTRL+C to quit)
...

With the API's server running on the default port 5000, start the Nobel-viz's Flask server from the `nobel_viz` directory:

nobel_viz $ python nobel_viz.py
 * Running on http://127.0.0.1:8000/ (Press CTRL+C to quit)
...

If you go to `http://localhost:8000` with your web browser of choice, you should see the Nobel-viz running.
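Once the EVE server is up, its resources can be queried over HTTP using standard EVE query parameters such as `where` (a JSON filter document) and `max_results`. A small helper for building such request URLs, as a sketch (the resource name `winners` is an assumption, not necessarily the one configured in `settings.py`):

```python
import json
from urllib.parse import urlencode

def eve_query_url(resource, where=None, max_results=25,
                  base="http://localhost:5000"):
    """Build an EVE-style request URL with an optional `where` filter."""
    params = {"max_results": max_results}
    if where:
        # EVE expects the filter as a JSON document in the query string
        params["where"] = json.dumps(where)
    return "%s/%s?%s" % (base, resource, urlencode(params))
```

For example, `eve_query_url("winners", where={"country": "France"})` builds a URL asking the API for French winners only, capped at 25 results.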
