#331 Featured applications for Comp Neuro lab
Closed: Fixed 4 years ago by ankursinha. Opened 4 years ago by major.

We need to list some featured applications of the Comp Neuro Lab and, preferably, get their logos as well.


Metadata Update from @major:
- Issue marked as blocking: #327
- Issue tagged with: D: Easy, F: Computational neuroscience, S: Next meeting

4 years ago

Metadata Update from @ankursinha:
- Issue assigned to ankursinha

4 years ago

How about this for a list? I'm not familiar with the field, but this is my best guess.

NEST simulator
FSLeyes
Nistats
Brain2mesh
Mcxlab
Neurosynth
GNU Octave
ZMat
LaTeX

If we can decide on 9 apps to feature, I can visit each site, get the logos, resize them, and collect the URLs of the software websites for the "More info" link.

Thanks for that @dan1mal

This particular image is limited to computational neuroscience software, though, so only NEST on that list qualifies. The rest are analysis/neuroimaging tools, which will be featured on our neuroimaging image (once we have more tools packaged up).

The best known tools for computational modelling of neurons and neuronal networks that we have in this image at the moment are:

  • NEST
  • Brian
  • Neuron
  • GENESIS
  • MOOSE

In addition, other modelling tools that we're including at the moment are:

  • COPASI
  • PyLEMS
  • neurord
  • auryn
  • smoldyn

(This is from the complete list of packages in this image: https://pagure.io/fedora-kickstarts/blob/master/f/fedora-comp-neuro-common.ks#_16)
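For anyone who hasn't looked at a kickstart before, the packages that go into the image are simply listed in the %packages section of that file. Here's a minimal sketch of what the relevant bit might look like; the package names are taken from the tools discussed in this ticket and may not match the actual spec exactly, so please treat the linked file as the authoritative list.

```
# Illustrative sketch of a kickstart %packages section; names are taken from
# the tools named in this ticket, not copied from the real file.
%packages
# simulators for neurons and neuronal networks
nest
genesis-simulator
moose
# other modelling tools
copasi
neurord
auryn
smoldyn
%end
```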

We'll keep adding more as we package more, of course :clap:

General-purpose utilities like Octave, LaTeX, etc. are lower priority. They are "nice to haves".

What do you think?

I'm not really the one to know which packages should be "featured", but since this is aimed at scientists, maybe showing that it includes R or Julia and the @python-science apps would help draw interest.

On a side note, I was wondering if there was any interest in putting a NeuroFedora presence on LinkedIn. Perhaps, if nothing else, to connect with other Neuro research groups.

> I'm not really the one to know which packages should be "featured", but since this is aimed at scientists, maybe showing that it includes R or Julia and the @python-science apps would help draw interest.

I'd put them as "Also, we have all of this analysis software".

For instance, in my lab, we spent 90% of our time working with simulators and only 10% with the analysis tools: developing the model really is the main bit, and only once that is done does the analysis come into focus. The modelling tools are also the ones that are hard to build and deploy because of how customised they are; Python, R, and so on are easy to deploy in comparison, using their own ecosystems.

Our target audience is not data scientists or statisticians; it is computational modellers. So, I think we'd "sell" better if we focused on the modelling tools for the "CompNeuro" deliverable. That's our niche. Does that sound OK?

> On a side note, I was wondering if there was any interest in putting a NeuroFedora presence on LinkedIn. Perhaps, if nothing else, to connect with other Neuro research groups.

Sure! Sounds good! I don't know any groups that use LinkedIn, but it'll be good to have a page for us there too. Can you look at how one sets it up, please? Maybe file a new ticket to track that task? We will need to ensure that it is kept up to date, though, so let's keep the resource requirements in mind as well. (I haven't set up a Twitter account for us because we don't have the resources to keep it active.)

Is there any chance there is a survey of the top neuroscience modelling tools? I've tried to Google descriptions for each tool on our list to see which might be most "marketable", but I can't really tell except by how developed their website (or logo :smile: ) is.

This is my best guess:

auryn
bionetgen
copasi
genesis-simulator
moose
nest
gnuplot
neurord
smoldyn

Not to keep diverging onto new ideas, but perhaps a long-term project, either within our group or in a wider neuroscience research community, could be some kind of annual survey of what is out there. It would also let developers link to and describe their projects. I know GitHub does an annual survey of dev tools, and that kind of data is interesting to me: I can see what others are using, see what other options might have potential, and investigate them.

(I'm going to put on the "domain expert" hat here, sorry :stuck_out_tongue: )

> Is there any chance there is a survey of the top neuroscience modelling tools?

Not really. ModelDB would be the best source: https://senselab.med.yale.edu/ModelDB/FindBySimulator

I think we need to distinguish specialist science software from general-purpose development-type software. The tools we're working on are not like music players/editors/IDEs, where there are lots with similar functions and varying sets of features. These are specialist tools aimed at particular functions. So, if you asked me, for example, whether I'd use NEURON or NEST or Brian, it would really come down to what task/model I'm to work on. Not all models can be done in NEURON, and not all of them can be done in NEST, or similarly in Brian. They're not better or worse; they are different. We can call them all "simulators", but we shouldn't use that to imply that they're similar and interchangeable. They really aren't.

> I've tried to Google descriptions for each tool on our list to see which might be most "marketable", but I can't really tell except by how developed their website (or logo 😄 ) is.
> This is my best guess:
> auryn
> bionetgen
> copasi
> genesis-simulator
> moose
> nest
> gnuplot
> neurord
> smoldyn

I'm afraid NEURON must be on the list. It is one of the oldest, most advanced simulators around, and it is used by a majority of researchers who do multi-compartmental modelling. Auryn, on the other hand, is the newest and the least used one; it has been used in fewer than 5 models to date.

I totally understand that marketability is important, but all of this software is specialist research software, and it isn't developed with marketability in mind the way software in industry is: these projects rarely have designers or marketing people on their teams, and grants don't give money for such activities. So, irrespective of whether they have flashy logos or not, the most commonly used ones that we have packaged must be featured.

In the same vein, our target audience is not a generic user or a generic scientist. It's a computational modeller who will not use a tool simply because it is marketed more. What we're trying to do is provide them with all the tools they may need, not nudge them towards a particular tool. That decision will be made on the basis of their research project.

So maybe the approach here should not be to find the flashiest ones; it should just be to try and feature as many tools from the list as we can manage?

> Not to keep diverging onto new ideas, but perhaps a long-term project, either within our group or in a wider neuroscience research community, could be some kind of annual survey of what is out there. It would also let developers link to and describe their projects. I know GitHub does an annual survey of dev tools, and that kind of data is interesting to me: I can see what others are using, see what other options might have potential, and investigate them.

Sure, ideas are always welcome, but again, given that we're not merely looking at generic dev-tools, I don't know how useful such a survey would be. :thought_balloon:

> So, irrespective of whether they have flashy logos or not, the most commonly used ones that we have packaged must be featured.

> So maybe the approach here should not be to find the flashiest ones; it should just be to try and feature as many tools from the list as we can manage?

All of the labs (e.g. https://labs.fedoraproject.org/en/python-classroom/) feature 9 programs, each with an icon and a short blurb about it. I think that's how many we should aim for.

If we can agree on 9 simulators that should be featured applications, I can find the logos and site URLs and post them here.

Sounds good! Would you have an hour for a video call next week? We could do it together then.

I'm finding Issues/comments quite tedious for such discussions. What do you think?

Here's the list based on the number of models on modeldb:

  • NEURON: 698 models
  • Brian (v1 and 2): 50 models
  • GENESIS: 48 models
  • NEST: 14 models
  • Calc: 8 models
  • NeuroRD: 5 models
  • MOOSE: 4 models

and then, two general tools:

  • Python scientific stack
  • R, maybe? (We have r-core in Fedora, and further modules can be installed using R's own package manager, much like pip for Python; see the sketch below.)
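To make that parenthetical concrete, here's a rough sketch of what this would look like for a user of the image; the package names (R-core, the CRAN "Matrix" package, numpy) are only illustrative examples, not a prescribed set.

```
# Install the R interpreter from the Fedora repositories.
sudo dnf install R-core

# Pull additional modules from CRAN using R's own package manager...
R -e 'install.packages("Matrix", repos = "https://cran.r-project.org")'

# ...much as Python users install extra modules with pip.
pip3 install --user numpy
```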

Metadata Update from @ankursinha:
- Issue close_status updated to: Fixed
- Issue status updated to: Closed (was: Open)

4 years ago
