neurosift-blog

November 2024


Dendro processing for Neurosift visualizations

2024-11-19

Most Neurosift visualizations are client-side, streaming only the essential portions of NWB files. However, some Neurosift visualizations rely on precomputed data to ensure a smooth, responsive user experience. This preprocessing is handled seamlessly using Dendro, which manages and processes containerized jobs to prepare data for visualization.

Examples of precomputed data:

How it works

  1. Queue Jobs: A batch script runs daily, identifying new jobs needed for public Dandisets on the DANDI Archive. These jobs are added to the Dendro job database as pending tasks.
  2. Process Jobs: Compute clients running on various machines (e.g., the Neurosift workstation and soon compute resources at MIT) retrieve jobs, execute them in containers, and update their statuses in the database.
  3. Upload Results: Processed results are uploaded by the compute clients to a Dendro cloud bucket, where Neurosift can access them for visualization.

You can track the current state of processing and job statuses on the Neurosift Compute Dashboard.

[Screenshot: the Neurosift Compute Dashboard]
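
To make the lifecycle above concrete, here is a minimal sketch of what a compute client's main loop might look like. The endpoint paths, field names, and URLs are hypothetical placeholders for illustration only, not the actual Dendro API.

import subprocess
import time

import requests

DENDRO_API = "https://dendro.example.org/api"  # placeholder URL, not the real service

def run_compute_client(client_id: str, api_key: str) -> None:
    """Illustrative main loop: claim a pending job, run it, upload the result, report the status."""
    headers = {"Authorization": f"Bearer {api_key}"}
    while True:
        # 1. Ask the job database for a pending job assigned to this client
        resp = requests.post(
            f"{DENDRO_API}/claim-job",
            json={"computeClientId": client_id},
            headers=headers,
        )
        job = resp.json().get("job")
        if job is None:
            time.sleep(60)  # nothing pending; poll again later
            continue
        try:
            # 2. Run the job (in the real system this happens inside a container)
            subprocess.run(job["command"], check=True)
            # 3. Upload the output to the Dendro cloud bucket, e.g. via a presigned URL
            with open(job["outputPath"], "rb") as f:
                requests.put(job["outputUploadUrl"], data=f)
            status = {"jobId": job["jobId"], "status": "completed"}
        except Exception as exc:
            status = {"jobId": job["jobId"], "status": "failed", "error": str(exc)}
        requests.post(f"{DENDRO_API}/set-job-status", json=status, headers=headers)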

Contributing compute resources

The system is designed to be open and collaborative, enabling users to contribute their own CPU cycles to Neurosift’s processing. Here’s how you can get involved:

  1. Authorization: Contact the Neurosift team to gain permission to process data for the Dendro-powered Neurosift service.
  2. Run a Dendro Compute Client: Install and configure the client on the machine where you’d like to contribute resources. The client will automatically pull jobs, process them in containers (Docker or Apptainer), and return results to the Dendro system.

By contributing compute power, you can help improve the scalability and responsiveness of Neurosift’s tools for the neuroscience community. Because compute clients are pluggable and can run on arbitrary machines, new resources can be added easily as computational needs grow.
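
As a rough illustration of what "process them in containers" looks like for a single job, the sketch below runs a job's command under Docker or Apptainer with a working directory mounted into the container. The image name, command, and paths are placeholders; the actual Dendro compute client manages images, parameters, and uploads itself.

import subprocess
from pathlib import Path

def run_in_container(image: str, command: list, work_dir: Path, use_apptainer: bool = False) -> None:
    # Run one job's command inside a container, with work_dir mounted at /work
    if use_apptainer:
        # Apptainer can pull and run Docker images directly via the docker:// prefix
        cmd = ["apptainer", "exec", "--bind", f"{work_dir}:/work",
               f"docker://{image}", *command]
    else:
        cmd = ["docker", "run", "--rm", "-v", f"{work_dir}:/work", image, *command]
    subprocess.run(cmd, check=True)

# Hypothetical usage: run a preprocessing script for one NWB asset
run_in_container(
    image="example/neurosift-precompute:latest",  # placeholder image name
    command=["python", "/work/process.py"],       # placeholder command
    work_dir=Path("/tmp/job-0001"),
)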

OpenNeuro integration for viewing EDF files

2024-11-13

Neurosift views of EDF files are now integrated into OpenNeuro.

To try it out, go to this dataset and find a file with an .edf extension in an ieeg folder. Click the view button to open the detailed file view; the Neurosift view is embedded there in an iframe. See this discussion and this previous post.

[Screenshots: the OpenNeuro file view and the embedded Neurosift EDF view]

JupyterLab extension for viewing Neurosift figures

2024-11-13

The neurosift_jp JupyterLab extension now allows you to view Neurosift figures directly in JupyterLab.

pip install --upgrade neurosift_jp

Here’s an example usage:

from IPython.display import display
from neurosift_jp.widgets import NeurosiftFigure

# Point the figure at an object inside a remote NWB file on the DANDI Archive
f = NeurosiftFigure(
    nwb_url='https://api.dandiarchive.org/api/assets/11c8af45-e9ad-4305-9737-12a490328e49/download/',
    item_path='/acquisition/treadmill_velocity'
)

display(f)

[Screenshot: a NeurosiftFigure rendered in a JupyterLab notebook]

In the future, this can be more tightly integrated with other tools such as pynwb.

The next step is to teach the AI agent to generate these figures.

Interactive views in the chat window

2024-11-12

The chat agent can now embed interactive views in its output. So far this is implemented only for a RasterPlot of a Units table and an interactive view of TimeIntervals objects. Here is an example:

[Screenshot: an interactive view embedded in a chat response]

Using remote Jupyter kernels for Neurosift’s chat agent

2024-11-08

In my last post, I introduced the ability for Neurosift’s AI chat agent to execute scripts using a local Jupyter kernel. This feature works by connecting to a local server configured to accept connections from the Neurosift website, sending code for execution, and receiving output via WebSocket. While convenient, this setup is limited to local resources. For remote setups, like a DANDI Hub instance, additional challenges arise, including CORS configuration and JupyterHub authentication.
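
For background, that local mechanism boils down to the standard Jupyter Server APIs: start a kernel through the REST API, then exchange Jupyter-protocol messages over the kernel's WebSocket channels. Here is a stripped-down standalone sketch of that pattern (not Neurosift's actual client code; the URL and token are placeholders):

import json
import uuid

import requests
import websocket  # pip install websocket-client

BASE = "http://localhost:8888"
TOKEN = "..."  # the Jupyter server's token, if it requires one

# 1. Start a kernel through the Jupyter Server REST API
kernel = requests.post(
    f"{BASE}/api/kernels",
    headers={"Authorization": f"token {TOKEN}"},
).json()

# 2. Open the kernel's WebSocket channels
ws = websocket.create_connection(
    f"ws://localhost:8888/api/kernels/{kernel['id']}/channels",
    header=[f"Authorization: token {TOKEN}"],
)

# 3. Send an execute_request (Jupyter messaging protocol) on the shell channel
msg_id = uuid.uuid4().hex
ws.send(json.dumps({
    "header": {"msg_id": msg_id, "msg_type": "execute_request",
               "username": "", "session": uuid.uuid4().hex, "version": "5.3"},
    "parent_header": {}, "metadata": {},
    "content": {"code": "print(6 * 7)", "silent": False},
    "channel": "shell",
}))

# 4. Read messages until the kernel is idle again, printing any stdout along the way
while True:
    msg = json.loads(ws.recv())
    if msg["msg_type"] == "stream":
        print(msg["content"]["text"], end="")
    if (msg["msg_type"] == "status"
            and msg["content"]["execution_state"] == "idle"
            and msg.get("parent_header", {}).get("msg_id") == msg_id):
        break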

To address these challenges, I’ve implemented the neurosift-jp Jupyter extension. This extension integrates the Neurosift chat agent directly into the JupyterLab interface, enabling it to use a remote Python kernel running on JupyterHub servers. For example, you can now harness DANDI Hub resources for script execution by the chat agent.

How to Get Started:

[Screenshot: the neurosift-jp chat agent in JupyterLab]

Embedded plots in the chat window

2024-11-06

The chat agent can now generate inline plots using either Matplotlib or Plotly. The agent writes Python code to produce the plot, the user inspects it and gives permission to run it on their Jupyter kernel, and the resulting plot is then included in the chat as a PNG image (for Matplotlib) or as an interactive plot (for Plotly).
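
As an illustration, the agent-generated code tends to look something like the script below, which streams an NWB file from the DANDI Archive and plots one acquisition with Matplotlib. The asset URL and object path are borrowed from the NeurosiftFigure example above; the exact code the agent writes will of course vary, and the libraries used here (pynwb, remfile, h5py) are just one reasonable choice.

import h5py
import matplotlib.pyplot as plt
import numpy as np
import pynwb
import remfile  # lets h5py read a remote HDF5 file over HTTP

url = "https://api.dandiarchive.org/api/assets/11c8af45-e9ad-4305-9737-12a490328e49/download/"

# Open the remote NWB file without downloading it in full
h5f = h5py.File(remfile.File(url), "r")
io = pynwb.NWBHDF5IO(file=h5f, load_namespaces=True)
nwb = io.read()

# Plot the first few thousand samples of the treadmill velocity acquisition
ts = nwb.acquisition["treadmill_velocity"]
n = 5000
y = ts.data[:n]
if ts.timestamps is not None:
    t = ts.timestamps[:n]
else:
    t = ts.starting_time + np.arange(n) / ts.rate

plt.figure(figsize=(8, 3))
plt.plot(t, y)
plt.xlabel("Time (s)")
plt.ylabel(ts.unit or "")
plt.title("treadmill_velocity")
plt.show()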

For example:

[Screenshot: a chat response with an embedded plot]

These chats can also be saved, and the data for the embedded plots are saved as well.

Running Python Code Directly From the Chat Window

2024-11-01

It is now possible to run Python code snippets generated by the Neurosift AI agent directly from the chat window. This requires a Jupyter server running on your local machine on port 8888 (the default port for JupyterLab), and the Neurosift website needs permission to interact with the kernel. Run this command to start it up:

jupyter lab --NotebookApp.allow_origin='https://neurosift.app' --no-browser

and make sure it is running on port 8888.

Then open a chat window and ask something like “Create a plot for a random NWB file in a random Dandiset”. The agent will generate the code for you, and you can click the play button above the code snippet to run it in your local JupyterLab environment. You should see the output on the right side of the chat window. Of course, the code may have bugs, in which case you will see the error messages, and the AI may choose an operation that takes a very long time; you can use the stop button in the right panel to halt the execution.

Here’s an example chat that plots some LFP traces:

[Screenshot: a chat that generates and runs code to plot LFP traces]