Evolution in NERSC's workload over the past five years has been driven by the explosion of data across the sciences. Experimental and observational science facilities (microscopes, telescopes, genome sequencers, particle accelerators) increasingly need supercomputing to make sense of the data they generate. Artificial intelligence (AI) models designed to solve complex problems are trained using both massive data sets and supercomputer simulations. Jupyter has become the de facto platform for data science and AI workflows in general, and at NERSC in particular.
In 2016, we at the National Energy Research Scientific Computing Center (NERSC) began exploring whether Jupyter could provide a rich user interface to the new Cori supercomputer alongside the traditional command-line interface (CLI) and remote desktop (NX) access modes. Since then, Jupyter has become indispensable, serving as a primary point of entry to Cori and other NERSC systems for a substantial fraction of our users. Around 700 unique users run Jupyter on Cori each month, a figure that has tripled in three years. In a typical month about 3000 unique users log into Cori via CLI or NX, so in some sense 20-25% of user interaction with Cori goes through Jupyter.

How It Started and How It's Going

By 2015, we had observed with increasing regularity that users were trying to use SSH tunnels to launch and connect to their own Jupyter notebooks on Edison, a previous-generation supercomputer. One user even published a blog post \citep{a} explaining how to do it. NERSC recognized that Jupyter notebooks and similar tools were part of an emerging data science ecosystem we would need to engage, understand, and support. Faced with the challenge of how to authenticate users and launch, manage, and proxy their notebooks, we turned to JupyterHub, which had been released a few months earlier to address exactly those issues. JupyterHub provides a managed multi-user Jupyter service that mediates access to computational environments and resources. It has a highly extensible, relatively deployment-agnostic design built on powerful high-level abstractions (spawners, authenticators, services), and it is backed by a broad, robust open-source community invested in its continued development and propagation. From the perspective of an organization taking on the challenge of supporting any platform for a diverse and demanding user base, these characteristics represent potential strategic leverage.
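The spawner and authenticator abstractions surface directly in JupyterHub's configuration file. As a rough illustration only (not NERSC's actual configuration), a site could pair the built-in PAM authenticator with a Slurm-based spawner from the separately installed batchspawner package; the partition and walltime values below are hypothetical:

```python
# jupyterhub_config.py -- illustrative sketch, not a production setup.
c = get_config()  # noqa: F821  (this object is injected by JupyterHub at load time)

# Authenticator: how users prove who they are. PAMAuthenticator ships
# with JupyterHub; sites can substitute OAuth, LDAP, and other methods.
c.JupyterHub.authenticator_class = "jupyterhub.auth.PAMAuthenticator"

# Spawner: how each user's single-user notebook server is launched.
# batchspawner (a separate package) submits the server as a Slurm batch
# job, so notebooks run on compute nodes rather than on the Hub host.
c.JupyterHub.spawner_class = "batchspawner.SlurmSpawner"
c.SlurmSpawner.req_partition = "debug"   # hypothetical queue name
c.SlurmSpawner.req_runtime = "02:00:00"  # walltime for the session

# The Hub's proxy routes traffic to wherever the spawner reports the
# server is running, so users see one stable URL regardless of placement.
```

Because each abstraction is an ordinary Python class, swapping the launch or login mechanism is a one-line configuration change rather than a rewrite of the service.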