Programmatically submit a job that draws from multiple Python sources and a pickle file


I am trying to run a network simulation on SpiNNaker via the Python neuromorphic API (nmpi), but the job errors. The source code for the simulation is in a file I have called: ‘’

I am just wondering: when I submit the job using the nmpi client, should the command keyword argument now be command=" {system}"?
If not, how can I tell the service to execute the model in {system}? Also, what does {system} mean? The code that I most recently tried to execute is given below. The job launched, but it returned an error, probably without running a simulation.

```
import nmpi

c = nmpi.Client("rjjarvis", token=token)
job_id = c.submit_job(source="",
                      platform=nmpi.SPINNAKER,
                      collab_id=5458,
                      inputs=["unit_test/connections.p", "unit_test/"],
                      command=" {system}")
```
Thanks for any help or advice.


Hi Russell,

The source argument should be the path to the script on your local machine, I guess unit_test/ The arguments to inputs need to be publicly accessible URLs, and are expected to be data files, not code, so you should put the connections.p file somewhere online (e.g. a public Dropbox folder). You can leave the command argument empty in this situation.

If your code is split over several files, the best way is to put all code and data into an online zip archive or public Git repository, and give the URL of the archive/repo as the source argument. In this situation, the command argument should contain the path to your main Python script within the archive/repo. You can leave out the {system} placeholder unless your script expects to read the PyNN backend name from the command line.
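To illustrate the last point, here is a minimal sketch of how a main script might read the backend name when the job is submitted with something like command="run.py {system}". The function name backend_from_argv and the default value are my own choices, not part of the nmpi API; the key idea is that the placeholder is substituted with the platform name before the script runs, so it arrives as the first command-line argument.

```python
import sys


def backend_from_argv(argv, default="spiNNaker"):
    """Return the simulator backend name passed on the command line.

    When a job is submitted with e.g. command="run.py {system}", the
    placeholder is replaced by the platform name before the script is
    executed, so it arrives as the first positional argument.
    """
    if len(argv) > 1:
        return argv[1]
    return default


if __name__ == "__main__":
    simulator = backend_from_argv(sys.argv)
    print("Running with backend: " + simulator)
```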

Note that we are planning a feature which would avoid you having to create and upload the archive yourself, see

Hope this helps,



I decided to go with the approach of drawing data/source code from GitHub, so now I have:

```
import nmpi
from pprint import pprint

c = nmpi.Client("rjjarvis")
token = c.token
collabs = c.my_collabs()
job_id = c.submit_job(source="",

job = c.get_job(job_id, with_log=True)
pprint(job)
filenames = c.download_data(job, local_dir=".")
```

The script returns:

Job submitted

```
{'code': '',
 'collab_id': '5458',
 'command': '',
 'hardware_config': {'resource_allocation_id': 202},
 'hardware_platform': 'SpiNNaker',
 'id': 120426,
 'input_data': [{'id': 65091,
                 'resource_uri': '/api/v2/dataitem/65091',
                 'url': ''}],
 'log': '',
 'output_data': [],
 'provenance': None,
 'resource_uri': '/api/v2/queue/120426',
 'resource_usage': None,
 'status': 'submitted',
 'tags': [],
 'timestamp_completion': None,
 'timestamp_submission': '2018-08-13T02:25:20.303038+00:00',
 'user_id': '302429'}
```

Is there a way of testing what paths are visible to the machine I am uploading to? Something like running os.system('ls *') on the server? I am unsure whether I should be supplying absolute or relative URL paths as the input.




as if I had run ```git clone; cd HippNetTE``` on a server.

Also how far off is the new feature you allude to in GH issue 92?
"Note that we are planning a feature which would avoid you having to create and upload the archive yourself, see"


I am also thinking maybe I could clone this repository:

And I could try to refactor my code to match the passing test code inside:


This is now done (issue #9, not #92) - you can just give a local directory as the source argument, and all code and data in that directory will be uploaded for you.


Paths should be relative to the top-level of your Git repository, so the second case in your example.

os.listdir(".") should show the paths that are visible.
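Dropped into the top of the job script, a small debugging sketch along these lines would show everything the execution host can see (the helper name visible_paths is my own; print(os.listdir(".")) alone also works):

```python
import os


def visible_paths(root="."):
    """Return a sorted list of file paths visible below root.

    Calling this (and printing the result) at the top of the job
    script shows exactly which files the execution host can see,
    relative to its working directory.
    """
    found = []
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            found.append(os.path.relpath(os.path.join(dirpath, name), root))
    return sorted(found)


if __name__ == "__main__":
    print("\n".join(visible_paths()))
```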

I did notice something strange in your script: you are importing pyNN.neuron, whereas I would expect to see import pyNN.spiNNaker if you are running on SpiNNaker.


Yes, you are right. Thanks for taking the trouble to glance over the source code.

I developed template code that I knew at least ran with PyNN’s NEURON simulator backend first. In my haste to develop for SpiNNaker, I didn’t thoroughly remove all imports of and references to pyNN.neuron and h. I have fixed this problem now.

Also, I only just realised the changes you refer to are for the GUI portal access to SpiNNaker, which is great. I will try to get things working there first, and then hopefully I can piece together the Python nmpi approach from the job logs that this approach outputs.


It’s a good idea to test things through the GUI portal, but the ability to give a local directory as the source argument is available in the Python library; you just need to install the latest version.


In the short term, I got past this issue by using wget to download raw GitHub content via os.system('wget ...').
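For anyone trying the same workaround, a sketch of the pattern is below. The repository URL is a hypothetical placeholder (the real one from this thread is not shown), and the helper name wget_command is my own; the point is just to build the shell command handed to os.system and to check its exit status.

```python
import os


def wget_command(url, output=None):
    """Build the shell command used to fetch one raw file at job run time."""
    cmd = "wget -q " + url
    if output is not None:
        cmd += " -O " + output
    return cmd


# Hypothetical raw-content URL; substitute your own repository path.
url = "https://raw.githubusercontent.com/<user>/<repo>/master/run.py"
# status = os.system(wget_command(url))  # non-zero means the fetch failed
```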

I know it’s not the intended use, but it was the only thing I could get to work in a short amount of time.

The code is now failing for other Python 2/3 compatibility reasons. I wonder if there are plans to upgrade Python at the SpiNNaker interface, or should I invest in writing Python 2.7-compatible code?
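One concrete pitfall of this kind, given that the job loads a pickle file (connections.p), is that pickles written by Python 2 do not load cleanly under Python 3 and vice versa. A minimal sketch of a loader that works under both interpreters is below; the function name load_connections and the example data are my own, not from the thread.

```python
from __future__ import print_function

try:
    import cPickle as pickle   # Python 2
except ImportError:
    import pickle              # Python 3


def load_connections(path):
    """Load a pickled connection list under either Python 2 or 3.

    Pickles written by Python 2 may contain byte strings, so under
    Python 3 we pass encoding="latin-1" to pickle.load rather than
    letting the load fail with a UnicodeDecodeError.
    """
    with open(path, "rb") as f:
        if str is bytes:  # Python 2
            return pickle.load(f)
        return pickle.load(f, encoding="latin-1")
```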