If your code uses the `multiprocessing` library and you run it from an IPython/Jupyter notebook, it might fail with a duplicate signature exception.
The basic idea here is that Jupyter notebooks are designed in a client-server fashion. There are two main components: the frontend, which is an abstraction over the Jupyter client, and the kernel (the IPython kernel). The Jupyter client itself comprises two sub-components: the frontend proxy and the kernel proxy. Communication happens over a low-level message-queue library (ZeroMQ). The kernel proxy is responsible for connecting to the various IPython kernel sockets and marshalling/unmarshalling the data, while the frontend component presents output to stdout and accepts input from stdin (the notebook cells).
This exception is raised from the Jupyter client's session.py module. It fails because, if your multiprocessing library implicitly uses fork to create subprocesses, the forked children inherit the Jupyter client session state, so each child process ends up with the same session id, which must be unique per client process. Messages signed under a duplicated session can produce identical signatures, and the replay-protection check in session.py flags the duplicate and raises the exception.
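The mechanics behind the check can be sketched as follows. This is a toy model, not the actual session.py code; the key, message parts, and function names are made up for illustration. Each message is HMAC-signed with the shared session key, and previously seen digests are remembered; a repeated digest looks like a replayed message:

```python
import hashlib
import hmac

# Toy stand-in for the shared session key (hypothetical value)
KEY = b"shared-session-key"

def sign(parts):
    # HMAC over the serialized message parts, as the Jupyter protocol does
    h = hmac.new(KEY, digestmod=hashlib.sha256)
    for p in parts:
        h.update(p)
    return h.hexdigest()

digest_history = set()

def check(signature):
    # A signature seen twice is treated as a replayed message
    if signature in digest_history:
        raise ValueError("Duplicate Signature: %r" % signature)
    digest_history.add(signature)

msg = [b"header-with-msg-id-1", b"execute_request-content"]
check(sign(msg))  # first delivery: accepted
try:
    # After fork(), parent and child share the key and message counters,
    # so a child can emit byte-identical parts -> an identical signature.
    check(sign(msg))
except ValueError as exc:
    print(exc)
```

After a fork, parent and child sign with the same key and the same message ids, which is exactly the collision this check exists to catch.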
The resolution is to either upgrade your jupyter_client to version 7.0 or above, or move the function to be run into a separate file and modify the notebook code (the part employing the multiprocessing lib) to explicitly use the spawn start method to create subprocesses. The latter involves extra initialisation work, because you have to split your code into a module and import it into the notebook. There is also `os.register_at_fork` from the os module, which lets you register callables to be executed whenever a new child is forked; such a callback could reset the session id for the child process.
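The spawn-based fix can be sketched like this (the function name `square` and the module layout are hypothetical). In a real notebook you would put `square` in a separate file, say `worker.py`, and import it, because spawn-started children re-import the function by name rather than inheriting it:

```python
import multiprocessing as mp

def square(x):
    # In notebook usage this would live in a separate module (e.g. worker.py)
    # so that spawn-started children can import it by name.
    return x * x

if __name__ == "__main__":
    # Request the spawn start method explicitly instead of the platform
    # default (fork on Linux): each child starts a fresh interpreter and
    # does not inherit the kernel's session state.
    ctx = mp.get_context("spawn")
    with ctx.Pool(processes=2) as pool:
        results = pool.map(square, [1, 2, 3])
    print(results)  # [1, 4, 9]
```

The `if __name__ == "__main__"` guard matters here: spawned children re-import the main module, and without the guard they would try to create the pool again.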
Affected Jupyter Client Version: <7.0
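The `os.register_at_fork` route can be sketched like this on a Unix system. `SESSION_ID` here is a made-up stand-in for the session id that jupyter_client keeps on its `Session` object; how you would reset the real attribute depends on your jupyter_client version:

```python
import multiprocessing as mp
import os
import uuid

# Stand-in for the per-process session id jupyter_client keeps in session.py
SESSION_ID = uuid.uuid4().hex

def _reset_session_id():
    # Runs in each child immediately after fork(): regenerate the id so the
    # child's message signatures can no longer collide with the parent's.
    global SESSION_ID
    SESSION_ID = uuid.uuid4().hex

# Register the callback to fire in every newly forked child
os.register_at_fork(after_in_child=_reset_session_id)

def child_session_id(_):
    return SESSION_ID

ctx = mp.get_context("fork")  # the fork start method is Unix-only
with ctx.Pool(processes=2) as pool:
    child_ids = pool.map(child_session_id, range(2))

# Every forked worker ends up with an id distinct from the parent's
print(SESSION_ID, child_ids)
```

This keeps fork as the start method (so no code needs to move into a separate module) while ensuring each child's session state diverges from the parent's right after the fork.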