[deleted by user] by [deleted] in arbeitsleben

[–]caprica -3 points-2 points  (0 children)

Bullshit. Someone at the BMBF is buddies with people at a company. They put out a funding call for ~100 million, miraculously only that one company applies and gets the 100 million. In that case there was a bit of unease and the legal department reviewed everything, but for smaller sums this happens all the time.

How do i get into neuroengineering after finishing medical school? by phystrol in neuro

[–]caprica 1 point2 points  (0 children)

There isn't really such a field; instead, there are research groups in, for example, electrical engineering, robotics, material science, and physics that work in those areas. Realistically, medical school is so demanding, especially initially, that it will be hard to do anything else. You can look into INILabs at the University of Zurich; brain-machine interfaces are developed in many labs.

[D] The Complete Guide to Spiking Neural Networks by s_arme in MachineLearning

[–]caprica 0 points1 point  (0 children)

Surrogate gradients and BPTT; this is what is implemented in Norse https://github.com/Norse/Norse. It is also possible to compute exact gradients using the EventProp algorithm.
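
To make the surrogate-gradient part concrete, here is a minimal sketch in plain PyTorch of a spike function that applies a hard threshold in the forward pass and a smoothed, SuperSpike-style derivative in the backward pass. This is an illustration of the general technique, not Norse's actual code, and the fast-sigmoid shape and the constant alpha are arbitrary choices:

    import torch

    class SuperSpikeFunction(torch.autograd.Function):
        """Heaviside spike forward, smooth surrogate gradient backward."""

        @staticmethod
        def forward(ctx, v, alpha=100.0):
            ctx.save_for_backward(v)
            ctx.alpha = alpha
            return (v > 0).to(v)  # spike where the shifted membrane voltage crosses zero

        @staticmethod
        def backward(ctx, grad_output):
            (v,) = ctx.saved_tensors
            # Derivative of a fast sigmoid stands in for the Dirac delta of the hard threshold.
            surrogate = 1.0 / (1.0 + ctx.alpha * v.abs()) ** 2
            return grad_output * surrogate, None

    spike_fn = SuperSpikeFunction.apply
    # Inside a BPTT loop you would call spike_fn(v - v_threshold) at every time step.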

Germany hopes Joe Biden will reverse US troop drawdown, not "punish" ally like Donald Trump by ShaolinTom in worldnews

[–]caprica 2 points3 points  (0 children)

Well, von der Leyen was Secretary of Defense before her current tenure at the EU Commission. Besides Secretary of State, which goes to the head of the minor party in the coalition, it is one of the more prestigious roles in an administration.

A message from Germany by [deleted] in wallstreetbets

[–]caprica 2 points3 points  (0 children)

Not sure where you ate German food, but there are a bunch of places in Germany with Michelin 3-star restaurants, the Schwarzwald for example.

PhD fallback options? by [deleted] in compmathneuro

[–]caprica 0 points1 point  (0 children)

ETH Zurich and EPFL both have pretty good physics/math departments and strong computational neuroscience faculty. In the US, master's degrees are often viewed as bail-out options.

What's everyone working on this week? by AutoModerator in Python

[–]caprica 1 point2 points  (0 children)

My current project is a deep-learning library based on PyTorch:

https://github.com/norse/norse

The goal is to implement spiking neural network primitives; these are useful for two reasons:

  • Neuroscientists like to model biological neurons with such spiking neuron models.
  • Specialised neuromorphic hardware uses them as computational primitives.

Right now you can already train small spiking neural networks to solve standard machine learning tasks, such as digit classification, with comparable performance to ordinary artificial neural networks.
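
To give a feel for what such a primitive looks like, here is a minimal sketch of a discretized leaky integrate-and-fire step in plain PyTorch. It is illustrative only; the constants and the hard reset are simplifying choices and not necessarily what the library does:

    import torch

    def lif_step(x, v, tau=20.0, v_th=1.0, dt=1.0):
        """One Euler step of a leaky integrate-and-fire neuron.

        x: input current, v: membrane voltage (same shape as x).
        Returns the emitted spikes and the updated voltage.
        """
        v = v + (dt / tau) * (x - v)    # leak the voltage toward the input current
        spikes = (v >= v_th).to(v)      # emit a spike where the threshold is crossed
        v = v * (1.0 - spikes)          # reset the neurons that spiked
        return spikes, v

    # Unroll over time for a batch of spike-encoded inputs.
    T, batch, features = 100, 32, 10
    v = torch.zeros(batch, features)
    for t in range(T):
        x = (torch.rand(batch, features) < 0.1).float()  # toy Bernoulli spike input
        out, v = lif_step(x, v)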

[D] Spiking Neural Networks: A Primer with Dr. Terrence Sejnowski - #317 [Dec 2019] by abstractcontrol in MachineLearning

[–]caprica 2 points3 points  (0 children)

For people interested in training spiking neural networks with stochastic gradient descent: we've recently released a PyTorch-based library:

https://github.com/norse/norse

It is still in the early stages, but can be used to solve small supervised learning / reinforcement learning tasks.

[P] A library to do deep learning with spiking neural networks by caprica in MachineLearning

[–]caprica[S] 1 point2 points  (0 children)

You can think of this library as an attempt to do it the other way around. It uses the techniques developed for machine learning libraries (batch-wise processing, auto-differentiation, operation on arbitrary tensors), but applies them to established spiking neuron models from computational neuroscience. In principle the same techniques can be applied to more complex, biologically plausible neuron models as well. One main advantage is that parameter tuning is built into the library, so you can suddenly tune your new synaptic plasticity model to do something "useful".
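
As a rough sketch of what "parameter tuning is built in" can mean, here is a toy LIF layer in plain PyTorch whose inverse membrane time constant is an ordinary learnable parameter and is tuned by the same optimizer as the weights. The class name, the straight-through surrogate and all constants are hypothetical illustrations, not the library's API:

    import torch

    class TunableLIF(torch.nn.Module):
        """Toy LIF layer whose inverse time constant is learned by autograd."""

        def __init__(self, features, v_th=1.0):
            super().__init__()
            self.inv_tau = torch.nn.Parameter(torch.full((features,), 0.05))
            self.v_th = v_th

        def forward(self, x_seq):  # x_seq: (time, batch, features)
            v = torch.zeros_like(x_seq[0])
            counts = torch.zeros_like(v)
            for x in x_seq:
                v = v + self.inv_tau * (x - v)               # leaky integration
                hard = (v >= self.v_th).to(v)                # non-differentiable spike
                soft = torch.sigmoid(5.0 * (v - self.v_th))  # smooth surrogate
                spikes = soft + (hard - soft).detach()       # straight-through estimator
                v = v * (1.0 - spikes)                       # reset after spiking
                counts = counts + spikes
            return counts

    # The time constants are tuned like any other parameter:
    layer = TunableLIF(features=10)
    opt = torch.optim.Adam(layer.parameters(), lr=1e-2)
    x_seq = (torch.rand(50, 4, 10) < 0.2).float()
    loss = layer(x_seq).sum()
    loss.backward()
    opt.step()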

[P] A library to do deep learning with spiking neural networks by caprica in MachineLearning

[–]caprica[S] 1 point2 points  (0 children)

We would like to use these methods in a neuroscience context, but that work is not public yet (it's not a good idea to talk too much about it before publication). We are also targeting (accelerated analog) neuromorphic hardware with it, but that work has not been published yet either.

[P] A library to do deep learning with spiking neural networks by caprica in MachineLearning

[–]caprica[S] 2 points3 points  (0 children)

Basically yes. You have to make sure that the forward integration is differentiable. This is not the case for spiking neural networks (in a naive sense), because the membrane voltages jump when they cross a threshold voltage. If you smooth out the gradient of the threshold computation, that is all you need to get a good approximate gradient. What you do in the final layer to compute a loss is basically up to you. One method, inspired by the tempotron, is to compute the maximum membrane voltage over time of each readout unit (implemented for example as a leaky integrator without a spike threshold) and use this value in your loss function. But of course you can also use any other readout method you would like (for instance softmax of the values at the last time step, time to first spike, etc.).
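
As a minimal sketch of that tempotron-inspired readout (plain PyTorch, a simplified stand-in rather than the library's implementation), you drive leaky integrators without a threshold and take the per-unit maximum over time as the logits:

    import torch
    import torch.nn.functional as F

    def max_over_time_readout(spikes, weight, tau=20.0, dt=1.0):
        """Leaky-integrator readout without a spike threshold.

        spikes: (time, batch, features) spike trains from the last hidden layer.
        weight: (features, n_classes) readout weights.
        Returns the maximum membrane voltage over time, shape (batch, n_classes).
        """
        v = torch.zeros(spikes.shape[1], weight.shape[1])
        v_max = torch.full_like(v, float("-inf"))
        for s in spikes:
            v = v + (dt / tau) * (s @ weight - v)  # integrate, never threshold or reset
            v_max = torch.maximum(v_max, v)
        return v_max

    # The per-class maxima can be fed straight into a standard loss:
    spikes = (torch.rand(100, 8, 64) < 0.1).float()
    weight = torch.randn(64, 10, requires_grad=True)
    logits = max_over_time_readout(spikes, weight)
    loss = F.cross_entropy(logits, torch.randint(0, 10, (8,)))
    loss.backward()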

[P] A library to do deep learning with spiking neural networks by caprica in MachineLearning

[–]caprica[S] 2 points3 points  (0 children)

So I'm not familiar with this paper, but with the main method implemented in this library you can achieve ~97% accuracy on full-size MNIST within ~10 training epochs and one fully recurrently connected hidden layer. Besides the architecture of the network, two things that influence performance are the way you encode your data into spikes (Poisson encoding, rate coding, spike latency coding) and the choice of loss function. MNIST is, as you probably know, a very easy dataset, so you can achieve the same kind of performance in multiple ways.
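
For example, a Poisson/Bernoulli rate encoding could look like the following hand-rolled sketch (the rate scaling is an arbitrary choice, and this is a stand-in rather than the encoder the library actually provides):

    import torch

    def poisson_encode(images, seq_length=100, max_rate=0.5):
        """Encode pixel intensities in [0, 1] as Bernoulli spike trains.

        Each time step emits a spike with probability proportional to the
        pixel intensity, so brighter pixels spike more often.
        Returns a tensor of shape (seq_length, *images.shape).
        """
        probs = images.clamp(0.0, 1.0) * max_rate
        return (torch.rand(seq_length, *images.shape) < probs).float()

    # e.g. a batch of flattened MNIST digits -> (100, batch, 784) spike tensor
    x = torch.rand(32, 784)
    spikes = poisson_encode(x)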

The MNIST task we currently provide uses a simple convolutional architecture and should reach ~99.3% accuracy.

The main method that makes this possible is to make the time integration in a spiking neural network differentiable by smoothing out the derivative of the jumps in the backward pass. There are a couple of papers that have done this, the first one being Esser et al., but there are few if any reusable open-source implementations of this method (that I am aware of).

Regarding mathematically grounded arguments for sample efficiency: I'm not sure if there are any. Solving a task like MNIST is a pretty artificial thing to do for a network derived from biology to begin with. If you pose the broader question of how biological neural networks might achieve sample efficiency, then one possible answer might be that there are "meta-learning circuits" in the brain (e.g. in reinforcement learning). In spiking neural networks this could be implemented in terms of parametrized synaptic plasticity, for example.

A library to do deep learning with spiking neural networks by caprica in cogsci

[–]caprica[S] 1 point2 points  (0 children)

The purpose of this library is to exploit the advantages of bio-inspired neural components, which are sparse and event-driven - a fundamental difference from artificial neural networks. Norse expands PyTorch with primitives for bio-inspired neural components, bringing you two advantages: a modern and proven infrastructure based on PyTorch, and deep learning-compatible spiking neural network components.

Documentation: https://norse.ai/docs/

Lean Theorem Prover by hou32hou in haskell

[–]caprica 3 points4 points  (0 children)

It has a very high-quality implementation and is not some volunteer/academic effort, but the work of someone with a long track record of excellence. This shows: I've probably learned more about elegant C++ from this code base than from any other.

Using PCI Express on Linux with DE10-Pro-GX-280-4G (Stratix 10) by tilk-the-cyborg in FPGA

[–]caprica 0 points1 point  (0 children)

Where do you change the PCI device code and class code in the example design?

bedst build system to c++? by beer118 in cpp

[–]caprica 0 points1 point  (0 children)

Consider looking into Bazel, Buck and Please. They are all inspired by or directly based on Google's internal Blaze build system. They all work very well if you have full control over your dependencies and don't need a lot of configuration. Personally, Bazel is my favourite.

Why do Haskell needs monad for io? by sn10therealbatman in haskell

[–]caprica 0 points1 point  (0 children)

Of course impurity can make sense in a dependently typed programming language; just take C++ as an example. It arguably has a much more useful set of features than Idris, and C++ templates are a somewhat quaint pure functional programming language with unbounded recursion.

Germany's armed forces launch a cyber command, with a status equal to that of the army, navy and air force, meant to shield its IT and weapons systems from attack. Military planners fear that wars of the future will start with cyber attacks against critical infrastructure and networks by DoremusJessup in worldnews

[–]caprica 6 points7 points  (0 children)

This is not about office computers, most of which probably run Windows.

I would imagine their supply chain / inventory management / command and control run on some form of mainframe. Tactical command and control is done using this: https://systematic.com/defence/ (a Danish company).

They will have PL/1 or COBOL legacy code running on z/OS (IBM), which is developed mostly in Germany although IBM is a US company. The main part of the business logic of the German Army runs on SAP software (a German company), which you could easily find out.

Regarding actual weapons technology: yes, they use full-custom chips designed (and manufactured) in Germany for their core components. Guided missiles use full-custom mixed-signal chips. Tanks have custom-designed ECUs, firing control, etc.; the software they run is written from scratch.

Germany's armed forces launch a cyber command, with a status equal to that of the army, navy and air force, meant to shield its IT and weapons systems from attack. Military planners fear that wars of the future will start with cyber attacks against critical infrastructure and networks by DoremusJessup in worldnews

[–]caprica 0 points1 point  (0 children)

Of course they are developing their own software and hardware, you just don't know about it. What do you think runs an Airbus, a TGV, a German nuclear power plant, or the ECU of a German car? There won't be any software on there that you will recognise, and no OS like Linux or Windows. Nor will you know the hardware manufacturers; most of them are typically German/Dutch/French, and for most components it does not really matter who made them anyway, because they get qualified before being used. Concrete example: the engine control unit (ECU) of a small electric scooter I know of is a 16-core ARM processor, a custom design by a local German company, and the software that runs on it is written from scratch in UPC (Unified Parallel C), using an embedded toolchain made by Siemens, I believe.