
Contents

1. [Screen capture: the Braitenberg vehicle demo in Interactive Plots, with spike rasters and value plots for the left eye, left input, left motor, right eye, right input, and right motor populations, and the simulation control bar (data, mode, time step, speed, recording time, filter, time shown, layout) along the bottom.]

1.5.2 Binding Semantic Pointers (SPs)

- We want to manipulate sophisticated representational states; this is the purpose of the Semantic Pointer Architecture (SPA; see http://nengo.ca/build-a-brain).
- The main operation used to manipulate representations in the SPA is circular convolution, for binding.
- Let's explore a binding circuit for semantic pointers:
  - Input: two semantic pointers (high-dimensional vectors).
  - Output: one semantic pointer binding the original two.
  - Implementation: element-wise multiplication of the DFTs (i.e., circular convolution). Note: there is a drag-and-drop template for this operation.
- Open demo/convolve.py.

[Screen capture: the Convolution network's semantic pointer graphs; the A and B ensembles are most similar to the chosen vocabulary vectors, while the C ensemble is most similar to a*b.]
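The binding operation itself is easy to check outside Nengo. The following is a minimal NumPy sketch of circular convolution via the DFT; the dimensionality and the random unit vectors are illustrative and not taken from the demo:

    import numpy as np

    D = 64                                    # dimensionality of the semantic pointers (illustrative)

    def make_pointer(D):
        v = np.random.randn(D)
        return v / np.linalg.norm(v)          # random unit-length vector

    def bind(a, b):
        # circular convolution = element-wise multiplication of the DFTs
        return np.real(np.fft.ifft(np.fft.fft(a) * np.fft.fft(b)))

    a = make_pointer(D)
    b = make_pointer(D)
    c = bind(a, b)                            # the bound semantic pointer

    # c is similar to neither input on its own, which is what the demo's
    # similarity graphs show for the C ensemble
    print(np.dot(c, a), np.dot(c, b))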
2.      net.connect('SpeedNeurons', 'Oscillator', index_post=[2])

        # Define the nonlinear interactions in the state space of the oscillator
        def controlled_path(x):
            return x[0] + x[2] * speed * tau * x[1], x[1] - x[2] * speed * tau * x[0], 0

        net.connect('Oscillator', 'Oscillator', func=controlled_path, pstc=tau)
        net.add_to_nengo()

2.4 Basal Ganglia-Based Simulations

2.4.1 Basal Ganglia

Purpose: This demo introduces the basal ganglia model that the SPA exploits to do action selection.

Comments: This is just the basal ganglia, not hooked up to anything. It demonstrates that this model operates as expected (i.e., suppressing the output corresponding to the input with the highest input value). This is an extension, to a spiking, dynamic model, of the Redgrave et al. work. It is more fully described in several CNRG lab publications. It exploits the nps class from Nengo.

Usage: After running the demo, play with the 5 input sliders. The highest slider should always be selected in the output. When they are close, interesting things happen. You may even be able to tell that things are selected more quickly for larger differences in input values.

Output: See the screen capture below.

[Screen capture: the Basal Ganglia network, with the input sliders and the StrD1, STN, StrD2, GPe, and GPi/output populations.]

Code:
3.      A = net.make('A', 100, 1)
        B = net.make('B', 100, 1)
        net.connect(input, A)
        net.connect(A, B)
        net.add_to(world)

- The following demo scripts create models similar to those seen in this part of the tutorial:
  - demo/singleneuron.py shows what happens with an ensemble with only a single neuron in it (poor representation)
  - demo/twoneurons.py shows two neurons working together to represent a value
  - demo/manyneurons.py shows a standard ensemble of 100 neurons representing a value
  - demo/communication.py shows a communication channel
  - demo/addition.py shows adding two numbers
  - demo/2drepresentation.py shows 100 neurons representing a 2-D vector
  - demo/combining.py shows two separate values being combined into a 2-D vector

1.3 Non-Linear Transformations

1.3.1 Functions of one variable

- We now turn to creating nonlinear transformations in Nengo. The main idea here is that instead of using the X origin, we will create a new origin that estimates some arbitrary function of X. This will allow us to estimate any desired function. The accuracy of this estimate will, of course, be dependent on the properties of the neurons.
- For one-dimensional ensembles, we can calculate various one-dimensional functions, such as f(x) = x², thresholding functions such as f(x) = θ(x − c), and so on. A scripting sketch of this idea follows below.
- To perform a non-linear operation, we need to define a new origin. The X origin ju...
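The sketch mentioned above: in the scripting interface, the same idea is expressed with the func= argument of net.connect(), which creates a new origin computing the given function. The names and the squaring function here are illustrative, not the tutorial's exact model:

    import nef

    net = nef.Network('Function of one variable')
    net.make_input('input', [0.5])
    net.make('A', 100, 1)                 # represents x
    net.make('B', 100, 1)                 # represents the estimate of f(x)

    def square(x):
        return x[0] * x[0]                # f(x) = x^2, decoded from A's activity

    net.connect('input', 'A')
    net.connect('A', 'B', func=square)    # adds a new origin on A that estimates x^2
    net.add_to_nengo()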
4. quick (boolean): if True, saves data from a created ensemble and will re-use it in the future when creating an ensemble with the same parameters as this one. If None, uses the Network default setting. Note: the use of this parameter is not encouraged any more; use seed=<number> or fixed_seed=<number> in the Network constructor instead (see the sketch below).

- seed (int): random number seed to use. Will be passed to both random.seed() and ca.nengo.math.PDFTools.setSeed(). If this is None and the Network was constructed with a seed parameter, a seed will be randomly generated.
- storage_code (string): an extra parameter to allow different quick files, even if all other parameters are the same.
- add_to_network (boolean): flag to indicate if the created ensemble should be added to the network.

Returns: the newly created ensemble.

make_array(name, neurons, length, dimensions=1, **args)

Create and return an array of ensembles. This acts like a high-dimensional ensemble, but actually consists of many sub-ensembles, each one representing a separate dimension. This tends to be much faster to create, and can be more accurate, than having one huge high-dimensional ensemble. However, since the neurons represent different dimensions separately, we cannot compute nonlinear interactions between those dimensions.

Note: When forming neural connections from an array to another ensemble (or another array), any specified function to be computed will be computed on each ensemble in...
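As a rough illustration of the seed-related options above (the network and ensemble names are arbitrary; this is a sketch of typical usage, not a prescribed recipe):

    import nef

    # fixed_seed: every ensemble with the same parameters gets identical neurons;
    # seed: one starting seed, so ensembles differ from each other but the whole
    # model is reproducible from run to run.
    net = nef.Network('Reproducible model', fixed_seed=5)     # or: nef.Network('...', seed=5)

    net.make('A', 100, 1)               # same neurons every time the script is run
    net.make('B', 100, 1, quick=True)   # quick caching can still be requested per ensemble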
5.      name='gate2')
        net.add_to_nengo()

2.5 Learning

2.5.1 Learning a Communication Channel

Purpose: This is the first demo showing learning in Nengo. It learns the same circuit constructed in the Communication Channel demo.

Comments: The particular connection that is learned is the one between the 'pre' and 'post' populations. This particular learning rule is a kind of modulated Hebb-like learning (see Bekolay, 2011 for details).

Note: The red and blue graph is a plot of the connection weights, which you can watch change as learning occurs (you may need to zoom in with the scroll wheel; the 'learning a square' demo has a good example). Typically, the largest changes occur at the beginning of a simulation. Red indicates negative weights, and blue positive weights.

Usage: When you run the network, it automatically has a random white noise input injected into it, so the input slider moves up and down randomly. However, learning is turned off, so there is little correlation between the representation of the 'pre' and 'post' populations.

Turn learning on: To allow the learning rule to work, you need to move the switch to 1. Because the learning rule is modulated by an error signal, if the error is zero the weights won't change. Once learning is on, the 'post' will begin to track the 'pre'.

Monitor the error: When the switch is 0 at the beginning of the simulation, there...
6.      net.make_input('input', [0, 0])   # Create a controllable 2-D input with a starting value of (0,0)
        net.make('neurons', 100, 2)       # Create a population with 100 neurons representing 2 dimensions
        net.connect('input', 'neurons')   # Connect the input to the neurons
        net.add_to_nengo()

2.2 Simple Transformations

2.2.1 Communication Channel

Purpose: This demo shows how to construct a simple communication channel.

Comments: A communication channel attempts to take the information from one population and put it in the next one. The transformation is thus the identity, f(x) = x. Notably, this is the simplest neural circuit in the demos. This is because the connection from the first to the second population consists only of connection weights that are applied to postsynaptic currents (PSCs) generated by incoming spikes.

Usage: Grab the slider control and move it up and down to see the effects of increasing or decreasing input. Both populations should reflect the input, but note that the second population only gets input from the first population, through synaptic connections.

Output: See the screen capture below.

[Screen capture: the Communications Channel network, with the input slider and the A and B population value plots.]

Code:

        import nef
        net = nef.Network('Communications Channel')
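The listing above is cut off at the page break; a minimal sketch of the complete channel in the scripting interface, following the same pattern as the tutorial script (the starting value is illustrative), looks like this:

    import nef

    net = nef.Network('Communications Channel')
    net.make_input('input', [0.5])    # controllable scalar input
    net.make('A', 100, 1)             # first population
    net.make('B', 100, 1)             # second population; it only receives spikes from A
    net.connect('input', 'A')
    net.connect('A', 'B')             # identity transform, f(x) = x
    net.add_to_nengo()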
7. [Function preview: a plot/table of the user-defined function, with input values from 0.00 to 1.00 and the corresponding output values.]

- Return to Interactive Plots and run the simulation.

[Screen capture: the integrator in Interactive Plots, with the input slider and the accumulating value of the ensemble.]

1.4.4 Adjusting Synaptic Time Constants

- You can adjust the accuracy of an integrator by using different neurotransmitters.
- Change the input termination to have a tau of 0.01 (10ms; GABA) and a transform of 0.01, using the inspector. Also change the feedback termination to have a tau of 0.01, but leave its transform at 1.

[Screen capture: the same network with the shorter time constants.]

- By using a shorter time constant, the network dynamics are more sensitive to small-scale variation (i.e., noise).
- This indicates how important the use of a particular neurotransmitter is, and why there are so many different types with vastly differing time constants:
  - AMPA: 2-10ms
  - GABA_A: 10-20ms
  - NMDA: 20-150ms
  The actual details of these time constants vary across the brain as well. We are collecting empirical data on these from various sources at http://ctnsrv.uwaterloo.ca/cnrglab/node/246.
- You can also run this example...
8.      import nef
        import nps

        D = 5

        net = nef.Network('Basal Ganglia')        # Create the network object

        net.make_input('input', [0] * D)          # Create a controllable input function with a starting
                                                  # value of 0 for each of D dimensions
        net.make('output', 1, D, mode='direct')   # Make a population with 100 neurons, 5 dimensions, and
                                                  # set the simulation mode to direct

        nps.basalganglia.make_basal_ganglia(net, 'input', 'output', D, same_neurons=False, neurons=50)
                                                  # Make a basal ganglia model with 50 neurons per action
        net.add_to_nengo()

2.4.2 Cycling Through a Sequence

Purpose: This demo uses the basal ganglia model to cycle through a 5-element sequence.

Comments: This basal ganglia is now hooked up to a memory. This allows it to update that memory based on its current input/action mappings. The mappings are defined in the code such that A->B, B->C, etc., until E->A, completing a loop. This uses the spa module from Nengo. The 'utility' graph shows the utility of each rule going into the basal ganglia. The 'rule' graph shows which one has been selected and is driving the thalamus.

Usage: When you run the network, it will go through the sequence forever. It's interesting to note the distance between the peaks of the selected items: it's about 40ms for this simple action. We like to make a big deal of this.

Output: See the screen capture below.
9. The control in this circuit changes the 'freq' variable in that equation.

Usage: When you run this demo, it will automatically put in a step function on the input to start the oscillator moving. You can see where it is in phase space in the XY plot; if you want to see that over time, right-click on the Oscillator population and select X->value. You can change the frequency of rotation by moving the visible slider. Positive values rotate clockwise, and negative values counter-clockwise.

Output: See the screen capture below.

[Screen capture: the Controlled Oscillator network, with the Start and Speed inputs, the SpeedNeurons population, and the Oscillator's XY phase-space plot.]

Code:

        import nef

        speed = 10    # Base frequency of oscillation
        tau = 0.1     # Recurrent time constant

        net = nef.Network('Controlled Oscillator')

        # Make controllable inputs
        net.make_input('Start', [1, 0], zero_after_time=0.15)   # Kick it to get it going
        net.make_input('Speed', [1])                            # Control the speed interactively

        # Make two populations, one for freq control, and one the oscillator
        net.make('Oscillator', 500, 3, radius=1.7)
        net.make('SpeedNeurons', 100, 1)

        # Connect the elements of the network
        net.connect('Start', 'Oscillator', index_post=[0, 1])
        net.connect('Speed', 'SpeedNeurons')
10. to set the value to one of the elements in the vocabulary (defined as a, b, c, d, or e in the code) by typing it in. Each element in the vocabulary is a randomly chosen vector. Set a different value for the two input graphs. The C population represents the output of a neurally computed circular convolution (i.e., binding) of the A and B input vectors.

The label above each semantic pointer graph displays the names of the vocabulary vectors that are most similar to the vector represented by that neural ensemble. The number preceding the vector name is the value of the normalized dot product between the two vectors (i.e., the similarity of the vectors). In this simulation, the most similar vocabulary vector for the C ensemble is a*b. The a*b vector is the analytically calculated circular convolution of the a and b vocabulary vectors. This result is expected, of course. Also of note is that the similarity of the a and b vectors alone is significantly lower; both of the original input vectors should have a low degree of similarity to the result of the binding operation. The 'show pairs' option controls whether bound pairs of vocabulary vectors are included in the graph.

Output: See the screen capture below.

[Screen capture: the Convolution network; the A and B semantic pointer graphs show the chosen vocabulary vectors, and the C graph is most similar to a*b.]
11. [Arm-control script, continued: a series of addDecodedTermination() calls on the convertAngles, FX, funcT, and controlV ensembles, adding terminations named 'elbow', 'inputFs', 'X1' through 'X4', 'shoulderRef', 'elbowRef', 'shoulder', and 'inputTs', each with its transform matrix, a postsynaptic time constant, and False for the modulatory flag; the individual transform matrices are not legible in this extraction.]
12. [Screen capture: the Convolution network's output graphs.]

Code:

        import nef
        import nef.convolution
        import hrr

        D = 10

        vocab = hrr.Vocabulary(D, include_pairs=True)
        vocab.parse('a+b+c+d+e')

        net = nef.Network('Convolution')   # Create the network object

        net.make('A', 300, D)              # Make a population of 300 neurons and 10 dimensions
        net.make('B', 300, D)
        net.make('C', 300, D)

        conv = nef.convolution.make_convolution(net, 'A', 'B', 'C', 100)
                                           # Call the code to construct a convolution network using the
                                           # created populations and 100 neurons per dimension
        net.add_to_nengo()

2.3 Dynamics

2.3.1 A Simple Integrator

Purpose: This demo implements a one-dimensional neural integrator.

Comments: This is the first example of a recurrent network in the demos. It shows how neurons can be used to implement stable dynamics. Such dynamics are important for memory, noise cleanup, statistical inference, and many other dynamic transformations. A minimal scripting sketch of this circuit follows below.

Usage: When you run this demo, it will automatically put in some step functions on the input, so you can see that the output is integrating (i.e., summing over time) the input. You can also input your own values. Note that since the integrator constantly sums its input, it will saturate quickly if you leave the input non-zero. This reminds us that neurons have a finite range of representation. Such saturation effects can be exploited to perf...
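The integrator's script is not reproduced on this page; a minimal sketch following the recipe in the tutorial (feedback weight of 1, input scaled by the synaptic time constant tau; the neuron count and tau value are the usual defaults, assumed here) is:

    import nef

    tau = 0.1                                          # synaptic time constant of the recurrent connection

    net = nef.Network('Simple Integrator')
    net.make_input('input', [0])
    net.make('A', 100, 1)                              # the integrator population
    net.connect('input', 'A', weight=tau, pstc=tau)    # scale the input by tau
    net.connect('A', 'A', pstc=tau)                    # recurrent feedback with weight 1
    net.add_to_nengo()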
13. 3.3.4 Encoders (a.k.a. preferred direction vectors)

You can specify the encoders (preferred direction vectors) for the neurons. By default, the encoders are chosen uniformly from the unit sphere. Alternatively, you can specify those encoders by providing a list. The encoders given will automatically be normalized to unit length.

        net.make('F', 100, 2, encoders=[[1, 0], [-1, 0], [0, 1], [0, -1]])
        net.make('G', 100, 2, encoders=[[1, 1], [1, -1], [-1, 1], [-1, -1]])

This allows you to make complex sets of encoders by creating a list with the encoders you want. For example, the following code creates an ensemble with 100 neurons, half of which have encoders chosen from the unit circle, and the other half of which are aligned on the diagonals:

        import random
        import math

        encoders = []                       # create an empty list to store the encoders
        for i in range(50):
            theta = random.uniform(-math.pi, math.pi)              # choose a random direction between -pi and pi
            encoders.append([math.sin(theta), math.cos(theta)])    # add the encoder
        for i in range(50):
            encoders.append(random.choice([[1, 1], [1, -1], [-1, 1], [-1, -1]]))   # add an aligned encoder

        net.make('G', 100, 2, encoders=encoders)   # create the ensemble

3.4 Speeding up Network Creation and Making Consistent Models

Whenever Nengo creates an ensemble, it needs to compute a decoder. This is done via the NEF method of doing a least-squares minimization computation. For large ensembles (500 neurons or more), this can take some time, since...
14. Nengo Documentation, Release 1.3
Centre for Theoretical Neuroscience, University of Waterloo
May 27, 2015

Nengo (http://nengo.ca) is a software package for simulating large-scale neural systems. To use it, you define groups of neurons in terms of what they represent, and then form connections between neural groups in terms of what computation should be performed on those representations. Nengo then uses the Neural Engineering Framework (NEF) to solve for the appropriate synaptic connection weights to achieve this desired computation. This makes it possible to produce detailed spiking neuron models that implement complex, high-level cognitive algorithms.

CHAPTER ONE: NENGO TUTORIAL

1.1 One-Dimensional Representation

1.1.1 Installing and Running Nengo

- Nengo runs on Linux, OS X, and Windows. The only requirement is that you have Java (http://java.com) already installed on your computer. Most computers already do have this installed.
- To download Nengo, download the file from http://nengo.ca.
- To install Nengo, just unzip the file.
- To run Nengo, either double-click on nengo.bat (in Windows) or run nengo (in OS X and Linux).

[Screen capture: the empty Nengo workspace.]

1.1.2 Creating Networks

- When creating an NEF model, the first step is to create a Network. This will...
15.     # connection mapping the 1D function 'feedback' into the 2D population, using the 1x2 transform
        net.add_to_nengo()

2.3.4 A Simple Harmonic Oscillator

Purpose: This demo implements a simple harmonic oscillator in a 2-D neural population.

Comments: This is more visually interesting on its own than the integrator, but the principle is the same. Here, instead of having the recurrent input just integrate (i.e., feeding the full input value back to the population), we have two dimensions which interact. In Nengo there is a 'Linear System' template, which can also be used to quickly construct a harmonic oscillator (or any other linear system). A minimal scripting sketch is given below.

Usage: When you run this demo, it will sit at zero while there is no input, and then the input will cause it to begin oscillating. It will continue to oscillate without further input. You can put inputs in to see the effects. It is very difficult to have it stop oscillating. You can imagine this would be easy to do by either introducing control (as in the controlled integrator demo), so you can stop the oscillation wherever it is, or by changing the tuning curves of the neurons (hint: so none represent values between -0.3 and 0.3, say, so when the state goes inside that (e.g.) 0.3-radius circle, the state goes to zero). Also, if you want a very robust oscillator, you can increase the feedback matrix to be slightly greater than identity.

Output: See the screen capture below. You will get a sine and cosine i...
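The scripting sketch mentioned in the comments, assuming the usual recurrent formulation for a neural harmonic oscillator with feedback matrix [[1, w*tau], [-w*tau, 1]]; the frequency, neuron count, and kick input are illustrative:

    import nef

    tau = 0.1            # feedback synaptic time constant
    omega = 6.28         # oscillation frequency in rad/s (illustrative)

    net = nef.Network('Simple Harmonic Oscillator')
    net.make_input('input', [1, 0])                    # a brief kick starts the oscillation
    net.make('Oscillator', 200, 2)
    net.connect('input', 'Oscillator', pstc=tau)
    net.connect('Oscillator', 'Oscillator', pstc=tau,
                transform=[[1, omega * tau], [-omega * tau, 1]])
    net.add_to_nengo()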
16. [Dialog: termination properties, with the weight fields, an 'Is Modulatory' checkbox, and OK/Cancel buttons.]

- For example, two 1-dimensional values can be combined into a single two-dimensional ensemble. This would be done with two terminations, one with a transformation (or coupling) matrix of [1 0] and the other with [0 1]. If the two inputs are called a and b, this will result in the following calculation: a*[1 0] + b*[0 1] = [a 0] + [0 b] = [a b]. This will be useful for creating non-linear transformations, as discussed further in the next section.
- There are additional ways to view 2D representations in the interactive plots, including:
  - Plotting the activity of the neurons along their preferred direction vectors
  - Plotting the 2D decoded value of the representation

[Screen capture: the input connected to a 2-D 'neurons' population in Interactive Plots.]

1.2.8 Scripting

- Along with the ability to construct models using this point-and-click interface, Nengo also provides a Python scripting language interface for model creation. These examples can be seen in the demo directory.
- To create the communication channel through the scripting interface, click on the Folder icon (top left), navigate to the demo directory inside the Nengo home folder, and open communication.py.
- The actual code for this can be seen by opening the communication.py file in a text editor:

        import nef
        net = nef.Network('Communications Channel')
        input = net.make_input('input', [0.5])
17. - Click on Set Functions.
- Select 'User-defined Function' and press Set Function.

[Dialog: function index 0, with a 'User-defined Function' selector and Set/Preview/OK/Cancel buttons.]

- For the Expression, enter x0 * x1.

[Dialog: Expression = x0*x1, with an Input Dimensions field, the list of registered functions, and New/Remove/Preview/OK/Cancel buttons.]

- Press OK, OK, and OK to finish creating the origin.
- Connect the new origin to the termination on the result ensemble.

[Screen capture: the Nonlinear Network, with inputA, inputB, A, B, Combined, and Result connected.]

- Run the simulation with Interactive Plots and view the appropriate graphs.
- The result should be approximately 40.
- Adjust the input controls to multiply different numbers together.

[Screen capture: Interactive Plots for the multiplication network; the Result graph reads approximately 40.]

- You can also run this example by opening demo/multiplication.py.

1.3.3 Combined Approaches

- We can combine these two approaches in order to compute more complex functions, such as x²y.
  - Drag an Origin onto the ensemble representing the first of the two values. Give it the name 'square', set its output dimensions to 1, and press Set Functions. As before, select...
18.         self.hinge2 = self.elbow.hingeAngle()

            self.upperarm.physics.setSleepingThresholds(0, 0)
            self.lowerarm.physics.setSleepingThresholds(0, 0)

        def update_neurons(self):
            while True:
                scale = 0.0003
                m1 = controlU.getOrigin('u1').getValues().getValues()[0] * scale
                m2 = controlU.getOrigin('u2').getValues().getValues()[0] * scale
                v1 = Vector3f(0, 0, 0)
                v2 = Vector3f(0, 0, 0)
                java.lang.System.out.println('m1: %f m2: %f' % (m1, m2))
                self.upperarm.physics.applyTorqueImpulse(Vector3f(0, 0, m1))
                self.lowerarm.physics.applyTorqueImpulse(Vector3f(0, 0, m2))
                self.hinge1 = self.shoulder.hingeAngle() - pi / 2
                self.hinge2 = self.elbow.hingeAngle()
                java.lang.System.out.println('angle1: %1.4f angle2: %f' % (self.hinge1, self.hinge2))
                self.upperarm.physics.getAngularVelocity(v1)
                self.lowerarm.physics.getAngularVelocity(v2)
                # put bounds on the velocity possible
                if v1.z > 2:
                    self.upperarm.physics.setAngularVelocity(Vector3f(0, 0, 2))
                if v1.z < -2:
                    self.upperarm.physics.setAngularVelocity(Vector3f(0, 0, -2))
                if v2.z > 2:
                    self.lowerarm.physics.setAngularVelocity(Vector3f(0, 0, 2))
                if v2.z < -2:
                    self.lowerarm.physics.setAngularVelocity(Vector3f(0, 0, -2))

                self.upperarm.physics.getAngularVelocity(v1)
                self.lowerarm.physics.getAngularVelocity(v2)

                wt = Transform()
                self.target.physics.motionState.getWorldTransform(wt)
                wt.setIdentity()
                tx = controlV.getTermination('inputRe...
19. [Screen capture: the Question Answering with Memory network; the semantic pointer graphs for A, B, C, E, and F show similarity values against the vocabulary (e.g., SQUARE, BLUE).]

Code:

        D = 16
        subdim = 4
        N = 100
        seed = 7

        import nef
        import nef.convolution
        import hrr
        import math
        import random

        random.seed(seed)

        vocab = hrr.Vocabulary(D, max_similarity=0.1)

        net = nef.Network('Question Answering with Memory')   # Create the network object

        net.make('A', 1, D, mode='direct')   # Make some pseudo-populations (so they run well on less
                                             # powerful machines): 1 neuron, 16 dimensions, direct mode
        net.make('B', 1, D, mode='direct')
        net.make_array('C', N, D / subdim, dimensions=subdim, quick=True, radius=1.0 / math.sqrt(D))
                                             # Make a real population, with 100 neurons per array element
                                             # and D/subdim elements in the array (each with subdim
                                             # dimensions); set the radius as appropriate for multiplying
                                             # things of this dimension
        net.make('E', 1, D, mode='direct')
        net.make('F', 1, D, mode='direct')

        conv1 = nef.convolution.make_convolution(net, 'A', 'B', 'C', N, quick=True)
                                             # Make a convolution network using the constructed populations
        conv2 = nef.convolution.make_convolution(net, 'C', 'E', 'F', N, invert_second=True, quick=True)
                                             # Make a correlation network by using convolution
20.     # Create a controllable input with a starting value of -0.45

        net.make('neuron', neurons=1, dimensions=1, max_rate=(100, 100),
                 intercept=(-0.5, -0.5), encoders=[[1]], noise=3)
                 # Make 1 neuron representing 1 dimension, with a maximum firing rate of 100, a tuning
                 # curve x-intercept of -0.5, an encoder of 1 (i.e. it responds more to positive values),
                 # and a noise of variance 3

        net.connect('input', 'neuron')   # Connect the input to the neuron

        net.add_to_nengo()

2.1.2 Two Neurons

Purpose: This demo shows how to construct and manipulate a complementary pair of neurons.

Comments: These are leaky integrate-and-fire (LIF) neurons. The neuron tuning properties have been selected so there is one 'on' and one 'off' neuron.

Usage: Grab the slider control and move it up and down to see the effects of increasing or decreasing input. One neuron will increase for positive input and the other will decrease. This can be thought of as the simplest population to give a reasonable representation of a scalar value.

Output: See the screen capture below.

[Screen capture: the Two Neurons network, with the input slider, the two neurons' spike rasters, and the decoded value.]

Code:

        import nef
        net = nef.Network('Two Neurons')    # Create the network
        net.make_input('input', [-0.45])    # Create a controllable input with a starting value of -0.45
21. Multiplication is extremely powerful. Following the simple usage instructions below suggests how you can exploit it to do gating of information into a population, as well as radically change the response of a neuron to its input (i.e., completely invert its tuning to one input dimension by manipulating the other).

Usage: Grab the slider controls and move them up and down to see the effects of increasing or decreasing input. The output is the product of the inputs. To see this quickly, leave one at zero and move the other. Or, set one input at a negative value and watch the output slope go down as you move the other input up.

Output: See the screen capture below.

[Screen capture: the Multiply network, with the input A and input B sliders, the A, B, and Combined populations, and the product shown in D.]

Code:

        import nef

        net = nef.Network('Multiplication')   # Create the network object

        net.make_input('input A', [8])        # Create a controllable input function with a starting value of 8
        net.make_input('input B', [5])        # Create a controllable input function with a starting value of 5

        net.make('A', 100, 1, radius=10)      # Make a population with 100 neurons, 1 dimension, a radius
                                              # of 10 (default is 1)
        net.make('B', 100, 1, radius=10)      # Make a po...
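The listing is cut off at the page break. The remaining steps of such a multiplication network typically combine the two scalars into one 2-D ensemble and decode their product from it; the following is a sketch in that spirit (ensemble names, sizes, and radii follow the figure and the GUI tutorial, but this is not the verbatim demo code):

    net.make('Combined', 200, 2, radius=15)          # 2-D population holding both values
    net.make('D', 100, 1, radius=100)                # output; radius covers products up to 10*10

    net.connect('A', 'Combined', transform=[1, 0])   # first value -> first dimension
    net.connect('B', 'Combined', transform=[0, 1])   # second value -> second dimension

    def product(x):
        return x[0] * x[1]

    net.connect('Combined', 'D', func=product)       # decode the product
    net.add_to_nengo()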
22. ...'input', with a weight of tau, which is the same as the synaptic time constant of the neurotransmitter used.
- Create a one-dimensional ensemble called 'Integrator'. Use 100 neurons and a radius of 1.
- Add two terminations with synaptic time constants of 0.1s. Call the first one 'input' and give it a weight of 0.1. Call the second one 'feedback' and give it a weight of 1.
- Create a new Function input using a Constant Function with a value of 1.
- Connect the Function input to the 'input' termination.
- Connect the X origin of the ensemble back to its own 'feedback' termination.

[Screen capture: the integrator network in the Nengo workspace.]

- Go to Interactive Plots. Create a graph for the value of the ensemble (right-click on the ensemble and select 'value').
- Press Play to run the simulation.
- The value stored in the ensemble should linearly increase, reaching a value of 1 after approximately 1 second.
- You can increase the amount of time shown on the graphs in Interactive Plots. Do this by clicking on the small downwards-pointing arrow at the bottom of the window. This will reveal a variety of settings for Interactive Plots. Change the time shown to 1.

[Screen capture: Interactive Plots showing the input and the ensemble's value ramping up.]

1.4.2 Representation Range

- What happens if the previous simulation runs for longer than one second?
- The value stored in the ensemble does not i...
23.                 intercept=(-1, 0), radius=1, encoders=[[-1]], quick=useQuick)

        def addOne(x):
            return [x[0] + 1]
        net.connect(thal, None, func=addOne, origin_name='xBiased', create_projection=False)

The last step to make the template appear in the Nengo interface is to add it to the list in python/nef/templates/__init__.py.

4.3 Running Experiments in Nengo

Once a model has been designed, we often want to run controllable experiments to gather statistics about the performance of the model. As an example, we may want to read the input data from a file and save the corresponding outputs to a separate file. This allows us to automate the process of running these simulations, rather than using the interactive plots viewer.

4.3.1 Inputs

Input can be provided for the simulation using the nef.SimpleNode approach covered in more detail in Adding Arbitrary Code to a Model. Here we create an origin that has a fixed set of input stimuli, and shows each one for 100 milliseconds:

        inputs = [[1, 1], [1, -1], [-1, -1], [-1, 1]]
        dt = 0.001
        steps_per_input = 100

        class Input(nef.SimpleNode):
            def origin_input(self, dimensions=2):
                step = int(round(self.t / dt))                  # find time step we are on
                index = (step / steps_per_input) % len(inputs)  # find stimulus to show
                return inputs[index]

This origin will cycle through showing the values [1,1], [1,-1], [-1,-1], and [-1,1] for 0.1 seconds each. Instead of manuall...
24. ...nef.Network.make_array()) or a Termination.

- transform (array of floats): The linear transform matrix to apply across the connection. If transform is T and pre represents x, then the connection will cause post to represent Tx. Should be an N by M array, where N is the dimensionality of post and M is the dimensionality of pre, but a 1-dimensional array can be given if either N or M is 1.
- pstc (float): post-synaptic time constant for the neurotransmitter/receptor implementing this connection.
- weight (float): scaling factor for a transformation defined with index_pre and index_post. Ignored if transform is not None. See nef.Network.compute_transform().
- index_pre (list of integers or a single integer): the indexes of the pre-synaptic dimensions to use. Ignored if transform is not None. See nef.Network.compute_transform().
- index_post (list of integers or a single integer): the indexes of the post-synaptic dimensions to use. Ignored if transform is not None. See nef.Network.compute_transform().
- func (function): function to be computed by this connection. If None, computes f(x) = x. The function takes a single parameter, x, which is the current value of the pre ensemble, and must return either a float or an array of floats. For example:

        def square(x):
            return x[0] * x[0]
        net.connect(A, B, func=square)

        def powers(x):
            return x[0], x[0]**2, x[0]**3
        net.c...
25.             power=0.5, seed=12)
                # Create a white noise input function (.1 base freq, max freq 10 rad/s, and RMS of .5; 12 is a seed)

        net.connect('input', 'pre')

        # Create a modulated connection between the 'pre' and 'post' ensembles
        learning.make(net, errName='error', N_err=100, preName='pre', postName='post', rate=5e-4)
                # Make an error population with 100 neurons, and a learning rate of 5e-4

        # Set the modulatory signal
        net.connect('pre', 'error')
        net.connect('post', 'error', weight=-1)

        # Add a gate to turn learning on and off
        net.make_input('switch', [0])   # Create a controllable input function with a starting value of 0
        gating.make(net, name='Gate', gated='error', neurons=40, pstc=0.01)
                # Make a gate population with 100 neurons and a postsynaptic time constant of 10ms
        net.connect('switch', 'Gate')

        # Add another non-gated error population running in direct mode
        net.make('actual error', 1, 1, mode='direct')   # Make a population with 1 neuron, 1 dimension,
                                                        # and run in direct mode
        net.connect('pre', 'actual error')
        net.connect('post', 'actual error', weight=-1)

        net.add_to_nengo()

2.5.2 Learning Multiplication

Purpose: This demo shows learning a familiar nonlinear function: multiplication.

Comments: The set-up here is very similar to the other learning demos. The main difference is that this demo learns a nonlinear...
26. Note: The neurons parameter specifies the number of neurons in each ensemble, not the total number of neurons.

The resulting array can be used just like a normal ensemble. The following example makes a single 10-dimensional ensemble and a network array of 5 two-dimensional ensembles, and connects one to the other:

        net.make_array('A', neurons=100, length=5, dimensions=2)
        net.make('B', neurons=500, dimensions=10)
        net.connect('A', 'B')

When computing nonlinear functions with an array, the function is applied to each ensemble separately. The following computes the products of five pairs of numbers, storing the results in a single 5-dimensional array:

        net.make_array('A', neurons=100, length=5, dimensions=2)
        net.make('B', neurons=500, dimensions=5)

        def product(x):
            return x[0] * x[1]

        net.connect('A', 'B', func=product)

3.7.6 Matrix operations

To simplify the manipulation of matrices, we have added a version of JNumeric to Nengo. This allows for a syntax similar to Matlab, but based on the NumPy python module. To use this for matrix manipulation, you will first have to convert any matrix you have into an array object:

        a = [[1, 2, 3], [4, 5, 6]]           # old method
        a = array([[1, 2, 3], [4, 5, 6]])    # new method

You can also specify the storage format to be used, as follows:

        a = array([[1, 2, 3], [4, 5, 6]], typecode='f')   # valid val...
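One common use for such a matrix is as the transform of a connection; the following short sketch passes a 3x2 transform to net.connect() (plain nested lists work just as well as array objects, and the ensemble sizes are illustrative):

    import nef

    net = nef.Network('Transform example')
    net.make('A', 100, 2)
    net.make('B', 100, 3)

    # 3x2 transform: B represents [x0, x1, x0 + x1]
    transform = [[1, 0],
                 [0, 1],
                 [1, 1]]
    net.connect('A', 'B', transform=transform)
    net.add_to_nengo()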
27.         value = numpy.array([[function(x)] for x in x_values])

            # find the optimum linear decoder
            A = numpy.array(A).T
            Gamma = numpy.dot(A, A.T)
            Upsilon = numpy.dot(A, value)
            Ginv = numpy.linalg.pinv(Gamma)
            decoder = numpy.dot(Ginv, Upsilon) / dt
            return decoder

        # find the decoders for A and B
        decoder_A = compute_decoder(encoder_A, gain_A, bias_A, function=function)
        decoder_B = compute_decoder(encoder_B, gain_B, bias_B)

        # compute the weight matrix
        weights = numpy.dot(decoder_A, [encoder_B])

        #############################################
        # Step 2: Running the simulation
        #############################################

        v_A = [0.0] * N_A        # voltage for population A
        ref_A = [0.0] * N_A      # refractory period for population A
        input_A = [0.0] * N_A    # input for population A

        v_B = [0.0] * N_B        # voltage for population B
        ref_B = [0.0] * N_B      # refractory period for population B
        input_B = [0.0] * N_B    # input for population B

        # scaling factor for the post-synaptic filter
        pstc_scale = 1.0 - math.exp(-dt / t_pstc)

        # for storing simulation data to plot afterward
        inputs = []
        times = []
        outputs = []
        ideal = []

        output = 0.0             # the decoded output value from population B
        t = 0
        while t < 10.0:
            # call the input function to determine the input value
            x = input(t)

            # convert the input value into an input for each neuron
            for i in range(N_A):
                input_A[i] = x * encoder_A[i] * gain_A[i] + bias_A[i]
28. weight, index_pre, and index_post: weight specifies the overall gain on the connection across all dimensions, and defaults to 1. For example:

        net.make('A', 100, 3)
        net.make('B', 100, 3)
        net.connect('A', 'B', weight=0.5)

makes a transform matrix of [[0.5, 0, 0], [0, 0.5, 0], [0, 0, 0.5]].

Note that the system by default assumes the identity matrix for the connection. If you don't want the identity matrix, and would prefer some other connectivity, specify index_pre and index_post. These indicate which dimensions in the first ensemble should be mapped to which dimensions in the second ensemble. For example:

        net.make('A', 100, 3)
        net.make('B', 100, 1)
        net.connect('A', 'B', index_pre=2)

makes a transform matrix of [[0, 0, 1]].

        net.make('A', 100, 1)
        net.make('B', 100, 3)
        net.connect('A', 'B', index_post=0)

makes a transform matrix of [[1], [0], [0]].

        net.make('A', 100, 4)
        net.make('B', 100, 2)
        net.connect('A', 'B', index_pre=[1, 2])

makes a transform matrix of [[0, 1, 0, 0], [0, 0, 1, 0]], which makes B hold the 2nd and 3rd elements of A.

        net.make('A', 100, 4)
        net.make('B', 100, 3)
        net.connect('A', 'B', index_pre=[1, 2], index_post=[0, 1])

makes a transform matrix of [[0, 1, 0, 0], [0, 0, 1, 0], [0, 0, 0, 0]], which makes B hold the 2nd and 3rd elements of A in its first two elements.

3.7.2 Adding noise...
29. - We can use a similar approach to plot the average activity for varying values of N:

        import stats
        s = stats.Stats('experiment', 'Test')
        for N in [5, 10, 20, 50]:
            data = s.data(N=N)
            plot(data.time[0], numpy.mean(data.A, axis=1), label='%d' % N)
        legend(loc='best')

4.7.8 Computing summary data

- We often want to look at data that's more high-level than the raw time-varying output. For example, we might want to determine the representation accuracy of the model. We can do this by writing a function that does our analysis. It should expect 2 inputs: a Reader object and a dictionary holding any other computed results.
- This particular example computes the mean squared error between the input and A values within 5 different 100ms time windows:

        import stats
        s = stats.Stats('experiment', 'Test')

        def error(data, computed):
            errors = []
            for t in [0.1, 0.3, 0.5, 0.7, 0.9]:
                A = data.get('A', (t, t + 0.1))
                ideal = data.get('input', (t, t + 0.1))
                errors.append(sqrt(mean((ideal - A)**2)))
            return mean(errors)

        s.compute('error', error)

- We can now see how the error changes as N varies by doing:

        import stats
        s = stats.Stats('experiment', 'Test')
        N = [5, 10, 20, 50]
        plot(N, [mean(s.computed(N=n).error) for n in N])

CHAPTER FIVE: THE...
30. [Screen capture: Interactive Plots for the addition network, with input1 and input2 feeding ensembles A and B and their sum in the output ensemble.]

This will be true for most values. However, if the sum is outside of the radius that was set when the neural group was formed (in this case, from -1 to 1), then the neurons may not be able to fire fast enough to represent that value (i.e., they will saturate). Try this by computing 1+1. The result will only be around 1.3.

- To accurately represent values outside of the range -1 to 1, we need to change the radius of the output ensemble. Return to the Nengo Workspace and configure ensemble B. Change its radii to 2. Now return to the Interactive Plots. The network should now accurately compute that 1+1=2.

1.2.6 Adjusting Transformations

- So far we have only considered projections that do not adjust the values being represented in any way. However, due to the NEF derivation of the synaptic weights between neurons, we can adjust these to create arbitrary linear transformations (i.e., we can multiply any represented value by a matrix).
- Each termination in Nengo has an associated transformation matrix. This can be adjusted as desired. In this case, we will double the weight of the original value, so instead of computing x+y the network will compute 2x+y. (A scripting sketch of the same transformation is given below.)
- Right-click on the first termination in the ensemble that has two proj...
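The scripting sketch referred to above: the same 2x + y computation can be set up from a script by scaling one termination, here with the documented weight argument (the names and input values are illustrative):

    import nef

    net = nef.Network('Weighted addition')
    net.make_input('x', [0.3])
    net.make_input('y', [0.2])
    net.make('C', 100, 1)

    net.connect('x', 'C', weight=2)   # first termination scaled by 2
    net.connect('y', 'C', weight=1)   # second termination left as the identity
    net.add_to_nengo()                # C now represents 2x + y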
31. [Screen capture: the Braitenberg vehicle in Interactive Plots, showing spike rasters for the left input, right input, left motor, and right motor populations, with the simulation control bar along the bottom.]

Code:

        from __future__ import generators
        import nef
        import space
        from java.awt import Color
        import ccm
        import random

        dt = 0.001
        N = 10

        class Bot(space.MD2):
            def __init__(self):
                space.MD2.__init__(self, 'python/md2/dalekx/tris.md2',
                                   'python/md2/dalekx/imperial.png',
                                   scale=0.02, mass=800, overdraw_scale=1.4)
                self.wheels = [space.Wheel(s, 0, z, radius=r),
                               space.Wheel(-s, 0, z, radius=r),
                               space.Wheel(0, s, z, friction=0, radius=r),
                               space.Wheel(0, -s, z, friction=0, radius=r)]

            def start(self):
                self.sch.add(space.MD2.start, args=(self,))
                self.range1 = space.RangeSensor(0.3, 1, 0, maximum=6)
                self.range2 = space.RangeSensor(-0.3, 1, 0, maximum=6)
                self.wheel1 = 0
                self.wheel2 = 0
                while True:
                    r1 = self.range1.range
32. [Screen capture: the Learn Square network in Interactive Plots, with the input, pre, post, error, and Gate populations and the learning switch.]

Code:

        N = 60
        D = 2

        import nef
        import nef.templates.learned_termination as learning
        import nef.templates.gate as gating
        import random

        random.seed(27)

        net = nef.Network('Learn Square')   # Create the network object

        # Create input and output populations
        net.make('pre', N, D)    # Make a population with 60 neurons, 2 dimensions
        net.make('post', N, D)   # Make a population with 60 neurons, 2 dimensions

        # Create a random function input
        net.make_fourier_input('input', dimensions=D, base=0.1, high=8, power=0.4, seed=0)
                # Create a white noise input function (.1 base freq, max freq 10 rad/s, and RMS of .4; 0 is a seed)

        net.connect('input', 'pre')

        # Create a modulated connection between the 'pre' and 'post' ensembles
        learning.make(net, errName='error', N_err=100, preName='pre', postName='post', rate=5e-4)
                # Make an error population with 100 neurons and a learning rate of 5e-4

        # Set the modulatory signal to compute the desired f...
33.     net = nef.Network('Learn Product')   # Create the network object

        # Create input and output populations
        net.make('pre', N, D)     # Make a population with 60 neurons
        net.make('post', N, 1)    # Make a population with 60 neurons, 1 dimension

        # Create a random function input
        net.make_fourier_input('input', dimensions=D, base=0.1, high=8, power=0.4, seed=0)
                # Create a white noise input function (.1 base freq, max freq 10 rad/s, and RMS of .4; 0 is a seed)

        net.connect('input', 'pre')

        # Create a modulated connection between the 'pre' and 'post' ensembles
        learning.make(net, errName='error', N_err=100, preName='pre', postName='post', rate=5e-4)
                # Make an error population with 100 neurons and a learning rate of 5e-4

        # Set the modulatory signal to compute the desired function
        def product(x):
            product = 1.0
            for xx in x:
                product *= xx
            return product

        net.connect('pre', 'error', func=product)
        net.connect('post', 'error', weight=-1)

        # Add a gate to turn learning on and off
        net.make_input('switch', [0])   # Create a controllable input function with a starting value of 0
                                        # (and 0 in the two dimensions)
        gating.make(net, name='Gate', gated='error', neurons=40, pstc=0.01)
                # Make a gate population with 40 neurons and a postsynaptic time constant of 10ms
        net.connect('switch', 'Gate')

        net.add_to_nengo()
34. NEF ALGORITHM

While Nengo provides a flexible, general-purpose approach to neural modelling, it is sometimes useful to get a complete look at exactly what is going on under the hood. The theory behind the Neural Engineering Framework is developed at length in Eliasmith & Anderson (2003), Neural Engineering (http://www.amazon.com/Neural-Engineering-Representation-Neurobiological-Computational/dp/0262550601), and a short summary is available in Stewart (2012), A Technical Overview of the Neural Engineering Framework (http://compneuro.uwaterloo.ca/publications/stewart2012d.html). However, for some people the best description of an algorithm is the code itself. With that in mind, the following is a complete implementation of the NEF for the special case of two one-dimensional populations with a single connection between them. You can adjust the function being computed, the input to the system, and various neural parameters. It is written in Python and requires Numpy (for the matrix inversion) and Matplotlib (to produce graphs of the output).

Code:

        # A Minimal Example of the Neural Engineering Framework
        #
        # The NEF is a method for building large-scale neural models using realistic neurons. It is a
        # neural compiler: you specify the high-level computations the model needs to compute, and the
        # properties of the neurons themselves, and the NEF determines the neural connections needed to
        # perform those operations.
        #
        # The standard software for bu...
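For reference, the least-squares step that the code implements can be stated compactly. With A the matrix of neuron activities sampled over the represented range, f(x) the target function, and the inverse taken as a pseudo-inverse, the decoders d and the connection weights between two populations are:

    \Gamma = A A^{\mathsf{T}}, \qquad
    \Upsilon = A \, f(\mathbf{x}), \qquad
    d = \Gamma^{-1} \Upsilon

    \hat{f}(\mathbf{x}) = \sum_i a_i(\mathbf{x}) \, d_i, \qquad
    \omega_{ij} = d_i \, e_j

Here a_i(x) are the neuron activities and e_j the encoders of the receiving population; the last expression corresponds to the weight-matrix computation in the code (weights = numpy.dot(decoder_A, [encoder_B])).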
35.     INFO ... Configuring NEFEnsemble ... ca.nengo.util.Memory: Used: 71208944 Total: 101449728 Max: 810942464
        INFO ... Configuring NEFEnsemble ... ca.nengo.util.Memory: Used: 69374440 Total: 164933632 Max: 810942464

If the external script is not working when you create an ensemble, you should see something like this:

        INFO ... Configuring NEF ...
        INFO ... Configuring NEF ... File not found: java.io.FileNotFoundException: external/matrix_-3645035487712329947.inv
        INFO ... Configuring NE ...
        INFO ... Configuring NE ...

4.6 GPU Computing in Nengo

Since neurons are parallel processors, Nengo can take advantage of the parallelization offered by GPUs to speed up simulations. Currently, only NEF Ensembles (and Network Arrays containing NEF Ensembles) can benefit from GPU acceleration, and they must be composed solely of leaky integrate-and-fire (LIF) neurons. You can still use the GPU for Networks which contain other types of nodes, but only nodes that meet these criteria will actually be executed on the GPU(s); the rest will run on the CPU.

This restriction is necessary because GPUs take advantage of a computing technique called Single Instruction, Multiple Data, wherein we have many instances of the same code running on different data. If we are only using NEF Ensembles containing LIF neurons, then we are well within this paradigm: we want to execute many instances of t...
36. ...a network. Either the node object or the node name can be used as a parameter to this function.

    Parameters: node — the node (or name of the node) to be removed.
    Returns: the node removed.

get(name, default=<type 'exceptions.Exception'>, require_origin=False)
    Return the node with the given name from the network.

set_alias(alias, node)
    Adds a named shortcut to an existing node within this network, to be used to simplify connect() calls. For example, you can do:

        net.set_alias('vision', 'A.B.C.D.E')
        net.set_alias('motor', 'W.X.Y.Z')
        net.connect('vision', 'motor')

releaseMemory()
    Attempt to release extra memory used by the Network. Call only after all connections are made.

getNeuronCount()
    Return the total number of neurons in this network.

run(time, dt=0.001)
    Run the simulation. If called twice, the simulation will continue for 'time' more seconds. To reset the simulation, call nef.Network.reset(). Typical use cases are to either simply call it once:

        net.run(10)

    or to call it multiple times in a row:

        t = 0
        dt = 0.1
        while t < 10:
            net.run(dt)
            t += dt

    Parameters:
    - time (float): the amount of time, in seconds, to run for
    - dt (float): the size of the time step to use

reset()
    Reset the simulation. Should be called if you have previously called nef.Network.run() but now want to reset the simulation to its in...
37. ...being represented and the preferred direction vector (see below), divided by the radius, is greater than the intercept. Note that since we divide by the radius, the intercepts will always be normalized to be between -1 and 1.

While this parameter can be used to help match the tuning curves observed in the system being modelled, one important other use is to build neural models that can perfectly represent the value 0. For example, if a 1-dimensional neural ensemble is built with intercepts in the range (0.3, 1), then no neurons at all will fire for values between -0.3 and 0.3. This means that any value in that range (i.e., any small value) will be rounded down to exactly 0. This can be useful for optimizing thresholding and other functions where many of the output values are zero. (A short sketch of this use follows below.)

By default, intercepts are uniformly distributed between -1 and 1. The intercepts can be specified by providing either a range or a list of values:

        net.make('G', 100, 2, intercept=(-1, 1))
        net.make('H', 100, 2, intercept=[0.8, -0.4, 0.4, -0.8])

Note: The type of brackets used is important. Python has two types of brackets for this sort of situation: round brackets and square brackets. Round brackets create a tuple, which we use for indicating a range of values to randomly choose within, and square brackets create a list, which we use for specifying a list of particular values to use.
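A small sketch of the thresholding use described above (the input value and ensemble size are illustrative):

    import nef

    net = nef.Network('Thresholding with intercepts')
    net.make_input('input', [0.2])

    # with all intercepts in (0.3, 1), no neuron fires for |x| < 0.3,
    # so small values decode to exactly 0
    net.make('thresholded', 100, 1, intercept=(0.3, 1))

    net.connect('input', 'thresholded')
    net.add_to_nengo()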
38. ...called tick(). Here is a simple example, where this function simply prints the current time:

        class Time(nef.SimpleNode):
            def tick(self):
                print 'The current time in the simulation is', self.t

As a more complex example, here is a tick function used to save spike raster information to a text file while the simulation runs:

        class SpikeSaver(nef.SimpleNode):
            def tick(self):
                f = file('data.csv', 'a+')
                data = A.getOrigin('AXON').getValues().getValues()
                f.write('%1.3f,%s\n' % (self.t, list(data)))
                f.close()

        net = nef.Network('Spike Saver example')
        A = net.make('A', 50, 1)
        saver = SpikeSaver('saver')
        net.add(saver)

3.6 Connection Weights

3.6.1 Viewing synaptic weights

For most Nengo models, we do not need to worry about the exact connection weights, as Nengo solves for the optimal connection weights using the Neural Engineering Framework. However, we sometimes would like to have direct access to this weight matrix so we can modify it in various ways. We do this using the weight_func argument in the nef.Network.connect() function. For example, to simply print the solved-for optimal weights, we can do the following:

        net.make('A', 100, 1)
        net.make('B', 100, 1)

        def print_weights(w):
            print w
            return w

        net.connect('A', 'B', weight_func=print_weights)

3.6.2 Adjusting weights

We can also adjust these weights by modifying... (see the sketch below)
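Section 3.6.2 is cut off at the page break; as a sketch of the kind of adjustment it introduces, a weight_func can return a modified copy of the solved-for matrix, for example jittering it with noise (the noise level here is arbitrary):

    import nef
    import random

    net = nef.Network('Adjusted weights')
    net.make('A', 100, 1)
    net.make('B', 100, 1)

    def add_noise(w):
        # w is the full connection weight matrix solved for by the NEF
        for i in range(len(w)):
            for j in range(len(w[i])):
                w[i][j] += random.gauss(0, 0.0001)
        return w

    net.connect('A', 'B', weight_func=add_noise)
    net.add_to_nengo()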
39. contain all of the neural ensembles and any needed inputs to the system.
- File > New Network, or drag a network from the sidebar.
- Give the network a name.

[Screen capture: the Nengo workspace with an empty 'Basic Network' open in a Network Viewer.]

- You can create networks inside of other networks. This can be useful for hierarchical organization of models.

1.1.3 Creating an Ensemble

- Ensembles must be placed inside networks in order to be used.
- Drag an ensemble from the sidebar and drop it in the network.

[Dialog: ensemble creation, with fields for Name, Number of Nodes, Dimensions, Node Factory, and Radius.]

- Here the basic features of the ensemble can be configured:
  - Name
  - Number of nodes (i.e., neurons)
  - Dimensions: the number of values in the vector encoded by these neurons (leave at 1 for now)
  - Radius: the range of values that can be encoded; for example, a value of 100 means the ensemble can encode numbers between -100 and 100
  - Node Factory: the type of neuron to use

[Dialog: LIF neuron parameters — tauRC, tauRef, Max rate (Low 100.0, High 200.0), and Intercept (Low -1.0, High 1.0).]

- For this tutorial (and for the majority of our research), we use LIF Neurons...
40.         def A(state='A'):       # e.g. If in state A,
                set(state='B')      # then go to state B
            def B(state='B'):
                set(state='C')
            def C(state='C'):
                set(state='D')
            def D(state='D'):
                set(state='E')
            def E(state='E'):
                set(state='A')

        class Routing(SPA):                 # Define an SPA model (cortex, basal ganglia, thalamus)
            dimensions = 16

            state = Buffer()                # Create a working memory (recurrent network) object, i.e. a Buffer
            vision = Buffer(feedback=0)     # Create a cortical network object with no recurrence
                                            # (so no memory properties, just transient states)
            BG = BasalGanglia(Rules())      # Create a basal ganglia with the prespecified set of rules
            thal = Thalamus(BG)             # Create a thalamus for that basal ganglia (so it uses the same rules)

            input = Input(0.1, vision='0.8*START+D')
                                            # Define an input: set the input to state 0.8*START+D for 100 ms

        model = Routing()

2.4.4 A Question Answering Network

Purpose: This demo shows how to do question answering using binding (i.e., see the convolution demo).

Comments: This example binds the A and B inputs to give C. Then the E input is used to decode the contents of C, and the result is shown in F. Essentially, it shows that unbinding gives back what was bound. Note: the b/w graphs show the decoded vector values, not neural activity. So the result in F should visually look like the A element if B is being unbound from C.

Usage: When you run the network, it will start by binding RED and CIRC...
41. ...inputs to 0.0, 0.6, 0.0. Run the simulation for a bit, then pause it.
- Set the inputs to 0.0, 0.6, 1.0. Continue the simulation.
- Measure how long it takes for the neurons for the fourth action to stop firing.

[Screen capture: the basal ganglia network (input, StrD1, StrD2, STN, GPe, GPi), with the spike rasters used to measure the selection latency.]

- In rats: 14-17ms. In the model: 14ms (or more, if the injected current isn't extremely large).

[Plot: selection latency (ms) versus the difference in utility between actions, with standard deviation bars; latency grows as the utility difference shrinks.]

- For details, see Stewart et al. (2010), http://ctnsrv.uwaterloo.ca/cnrglab/node/53.

1.5.4 Sequences of Actions

- To do something useful with the action selection system, we need two things:
  - A way to determine the utility of each action, given the current context.
  - A way to take the output from the action selection and have it affect behaviour.
- We do this using the representations of the Semantic Pointer Architecture:
  - Any cognitive state is represented as a high-dimensional vector (a semantic pointer).
  - Working memory stores semantic pointers using an integrator.
  - Calculate the utility of an action by computing the dot product between the current state and the state for the action (i.e., the IF portion of an IF-THEN production rule). This is a linear operation, so we can directly compute it using the connection weights betw... (a small numerical sketch follows below)
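A small numerical sketch of the dot-product idea (plain NumPy rather than the SPA classes; the vectors are random stand-ins for semantic pointers):

    import numpy as np

    D = 16
    rng = np.random.RandomState(0)

    def pointer(D):
        v = rng.randn(D)
        return v / np.linalg.norm(v)

    # the IF-portions of five production rules, one semantic pointer per action
    actions = np.array([pointer(D) for _ in range(5)])

    # a noisy version of the third action's state, standing in for working memory
    state = actions[2] + 0.1 * rng.randn(D)

    utilities = np.dot(actions, state)     # one linear transform gives every action's utility
    print(utilities.argmax())              # the basal ganglia would select action 2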
42. ...neurons with similar firing patterns are near each other, as they are in the brain. This does not otherwise affect the simulation in any way.
- Run the simulation and change the input. This will affect the neuron voltage.

[Screen capture: Interactive Plots with the input, A, and B populations, showing A's voltage grid and decoded value.]

- So far, we have just been graphing information about neural ensemble A. We have shown that these 100 neurons can accurately represent a value that is directly input to them.
- For this to be useful for constructing cognitive models, we need to also show that the spiking output from this group of neurons can be used to transfer this information from one neural group to another. In other words, we want to show that B can represent the same thing as A, where B's only input is the neural firing from group A. For this to happen, the correct synaptic connection weights between A and B (as per the Neural Engineering Framework) must be calculated. Nengo automatically calculates these weights whenever an origin is created.
- We can see that this communication is successful by creating graphs for ensemble B. Do this by right-clicking on B and selecting 'value', and then right-clicking on B again and selecting 'voltage grid'. To aid in identifying which graph goes with which ensemble, right-click on a graph and select 'label'. Graphs can be moved by dragging, and resized by dragging near the edges and corners, or...
43. ...of 15, called 'Combined'. Since it needs to represent multiple values, we increase the number of neurons it contains to 200.
- Add two terminations to 'Combined'. For each one, the input dimensions are 1. For the first one, use Set Weights to make the transformation be [1 0]. For the second one, use Set Weights to make the transformation be [0 1].
- Connect the two other ensembles to the 'Combined' one.

[Screen capture: the Nonlinear Network with inputA, inputB, A, B, and Combined connected.]

- Next, create an ensemble to represent the result. It should have a radius of 100, since it will need to represent values from -100 to 100 (i.e., max 10*10). Give it a single one-dimensional termination with a weight of 1.

[Screen capture: the Nonlinear Network with the Combined and result ensembles.]

- Now we need to create a new origin that will estimate the product of the two values stored in the combined ensemble.
  - Drag the Origin icon onto the Combined ensemble.
  - Set the name to 'product'. Set Output dimensions to 1.

[Dialog: origin creation, with Name = 'product', Output Dimensions = 1, and a Set Functions button.]
44. ...origins, terminations, ensemble factories, and so on) needed to create this network.

Parameters:
- name (string or NetworkImpl): If a string, create and wrap a new NetworkImpl with the given name. If an existing NetworkImpl, then create a wrapper around that network.
- quick (boolean): Default setting for the quick parameter in nef.Network.make().
- fixed_seed (int): random number seed to use for creating ensembles. Every ensemble will use this seed value, resulting in identical neurons in ensembles that have the same parameters. By default, setting this parameter will enable quick mode.
- seed (int): random number seed to use for creating ensembles. This one seed is used only to start the random generation process, so each neural group created will be different (unlike the fixed_seed parameter). By default, setting this parameter will enable quick mode.

make(name, neurons, dimensions, tau_rc=0.02, tau_ref=0.002, max_rate=(200, 400), intercept=(-1, 1), radius=1, encoders=None, decoder_noise=0.1, eval_points=None, noise=None, noise_frequency=1000, mode='spike', add_to_network=True, node_factory=None, decoder_sign=None, seed=None, quick=None, storage_code='')

Create and return an ensemble of neurons.

Parameters:
- name (string): name of the ensemble (must be unique)
- neurons (integer): number of neurons in the ensemble
- dimensions (integer): number of dimensions to represent
- tau_rc (float): me...
45. r2 self range2 range input1l functions 0 value r1 1 8 input2 functions 0 value r2 1 8 fl motorl getOrigin xX getValues getValues 0 f 2 motor2 getOrigin X getValues getValues 0 self wheels 1 force f1 600 self wheels 0 force f2 600 yield dt class Room space Room def __init_ self space Room __init__ self 10 10 dt 0 01 def start self self bot Bot self add self bot 0 0 1 view space View self 0 10 5 for i in range 6 self add space Box 1 1 1 mass 1 color Color 0x8888FF flat_shading False random uniform 5 5 random uniform 5 5 random uniform 4 6 self sch add space Room start args self from ca nengo util impl import NodeThreadPool NEFGPUInterface net nef Network Braitenberg inputl net make_input right eye 0 input2 net make_input left eye 0 sensel net mak sense2 net mak motorl net make motor2 net make right input N 1 left input N 1 right motor N 1 left motor N 1 net connect input1l sensel net connect input2 sense2 net connect sense2 motor1 net connect sensel motor2 net add_to_nengo r ccm nengo create Room net add r 2 6 2 Arm Control Purpose This demo shows an example of having an interaction between a neural and non neural dynamical simula tion 2 6 Miscellaneous 77 Nengo Documentation Release 1 3 Comments The majority of
run on the CPU.
2. You can also set the number of GPUs to use for simulation in a Python script. This is useful if you want to ensure that a given network created by a script (and maybe even run in that script) always runs with the same number of devices. To achieve this, add the following line to your script:
ca.nengo.util.impl.NEFGPUInterface.setRequestedNumDevices(x)
where x is the number of devices you want to use for the resulting network.
3. GPU simulations can be combined with CPU multithreading. The parallelization dialog lets you select the number of CPU threads to use. All NEF Ensembles that are set to run on the GPU will run there, and the rest of the nodes in the Network will be parallelized via multithreading. This is especially useful for speeding up SimpleNodes that do a lot of computation. The optimal number of threads will vary greatly depending on the particular network you are running and the specs of your machine, and it generally takes some experimentation to get right. However, using a number of threads equivalent to the number of cores on your machine is usually a good place to start.
4. *** If you installed libNengoUtilsGPU and it succeeded, then the parallelization dialog will have the Use GPU for Ensemble Creation checkbox enabled. If you check the box and press OK, then all NEF Ensembles you create afterwards will use the GPU for the
47. start self self target space Sphere 0 2 mass 1 color Color 0xFF0000 self add self target 0 0 2 torso space Box 0 1 0 1 1 5 mass 100000 draw_as_cylinder True color Color 0x4444FF self add torso 0 0 1 upperarm space Box 0 1 0 7 0 1 mass 0 5 draw_as_cylinder True color Color 0x8888FF overdraw_radius 1 2 overdraw_length 1 2 self add upperarm 0 7 0 5 2 upperarm add_sphere_at 0 0 5 0 0 1 Color 0x4444FF self upperarm add_sphere_at 0 0 5 0 0 1 Color 0x4444FF self lowerarm space Box 0 1 0 75 0 1 mass 0 1 draw_as_cylinder True color Color 0x8888FF overdraw_radius 1 2 overdraw_length 1 1 82 Chapter 2 Nengo Demos Nengo Documentation Release 1 3 self add lowerarm 0 7 1 5 2 shoulder HingeConstraint torso physics upperarm physics Vector3f 0 7 0 1 1 Vector3f 0 0 5 0 Vector3f 0 0 1 Vector3f 0 0 1 elbow HingeConstraint upperarm physics lowerarm physics Vector3f 0 0 5 0 Vector3f 0 0 5 0 Vector3f 0 0 1 Vector3f 0 0 1 shoulder setLimit pi 2 pi 2 1 elbow setLimit pi 0 self physics addConstraint elbow self physics addConstraint shoulder upperarm physics applyTorqueImpulse Vector3f 0 0 300 lowerarm physics applyTorqueImpulse Vector3f 0 0 300 self sch add space Room start args self self update_neurons self upperarm upperarm self lowerarm lowerarm self shoulder shoulder self elbow elbow self hingel self shoulder hingeAngle
48. 2 2 2 L1 L2 law of cosines if xxx2 yxx2 lt L1 24 L2 2 D D find elbow down angles from shoulder to elbow if D lt 1andD gt 1 elbow acos D else elbow 0 if x x x2 y 2 lt L1 2 L2 2 lbow pi lbow return elbow def getDimension self return 2 class getX ca nengo math Function def map self X shoulder X 0 elbow X 1 L1 79 L2 45 return L1 cos shoulder L2 cos shoulder elbow def getDimension self return 2 class getY ca nengo math Function def map self X shoulder X 0 elbow X 1 L1 29 L2 5 return L1 sin shoulder L2 sin shouldert telbow def getDimension self return 2 input functions refX net make_input refX 1 refY net make_input refY 1 Tfunc net make_input T matrix 1 0 0 1 F net make_input F 1 0 1 0 0 1 0 1 80 Chapter 2 Nengo Demos Nengo Documentation Release 1 3 neural populations convertXY net make convert XY N 2 convertAngles net make convert Angles N 2 funcT net make funcT N 6 FX net make FX N 12 controlV net make control signal v N 2 calculate 2D control signal controlU net make control signal u 500 2 quick True calculates jkoint torque control signal add terminations convertXY addDecodedTermination refxy 1 0 0 1 pstc False convertAngles addDecodedTermination shoulder 1
4.5.2 Step 2: Tell Nengo where Python is
For Linux and OS X, Nengo should automatically be able to find Python to run it. However, this seems not to be the case for some versions of Windows. Under Windows, you will have to edit the file external/pseudoInverse.bat by changing this:

python pseudoInverse.py %1 %2 %3 %4

into this:

C:\python27\python.exe pseudoInverse.py %1 %2 %3 %4

where C:\python27 is the directory you installed Python into (this is the default).

4.5.3 Step 3: Testing
You should now be able to create larger ensembles in Nengo. A population with 500 neurons should take 5-10 seconds; 1000 neurons should take 15-30 seconds. The largest we recommend using this technique for is 5000 neurons, which may take up to a half hour. As always, you can make use of the Speeding up Network Creation and Making Consistent Models option to re-use ensembles once they have been created.
To confirm that Nengo is using this faster approach, you can examine the console output from Nengo. This is the window Nengo is being run from; on Windows, this is a black window showing a lot of text as ensembles are created. (This is not the Script Console at the bottom of the main Nengo window.) If the external script is working, when you create an ensemble you should see something like this:

INFO Configuring NEFEnsemble ca nengo util Memory Used 62454552 Total 101449728 Max
INFO Configuring NEFEnsemble ca nengo util Memory
• The pattern in the visual buffer is successfully transferred to working memory, and then the sequence is continued from that letter.
[Figure: production cycle time versus inhibitory time constant (ms), showing the mean and standard deviation, with the biologically consistent range of values marked]
• It takes longer (60-70 ms) for these more complex productions to occur.

1.5.6 Question Answering
• The control signal in the previous network can also be another semantic pointer that binds/unbinds the contents of the visual buffer, instead of just a gating signal. This more flexible control does not add processing time: it allows processing the representations while routing them.
• This allows us to perform arbitrary symbol manipulation, such as "take the contents of buffer X, unbind it with buffer Y, and place the results in buffer Z".
• Example: Question answering
  - The system is presented with a statement such as "red triangle and blue circle"; a semantic pointer representing this statement is placed in the visual cortical area: statement = red * triangle + blue * circle
  - The statement is removed after a period of time.
  - Now a question is presented, such as "What was red?"; question = red is presented to the same visual cortical area as before.
  - The goal is to place the correct answer in a motor cortex area (in this case, triangle).
• This is achieved by creating two action rules.
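To make the binding arithmetic concrete, here is a plain-Python sketch of circular convolution and its approximate inverse. This is illustrative math only, not Nengo code; the dimension and the vector names are made up.

import math, random

D = 16
def conv(a, b):
    # circular convolution: the binding operation for semantic pointers
    return [sum(a[j] * b[(i - j) % D] for j in range(D)) for i in range(D)]

def inverse(a):
    # approximate inverse: reorder the elements as a[0], a[D-1], ..., a[1]
    return [a[0]] + a[:0:-1]

def rand_vec():
    return [random.gauss(0, 1.0 / math.sqrt(D)) for i in range(D)]

red, triangle, blue, circle = rand_vec(), rand_vec(), rand_vec(), rand_vec()
statement = [x + y for x, y in zip(conv(red, triangle), conv(blue, circle))]
answer = conv(statement, inverse(red))   # noisy vector most similar to 'triangle'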
51. D inputE 1 5 ZERO inputE 1 7 SQUARE inputE 1 85 BLUE inputE 2 0 ZERO inputE 2 2 CIRCLE inputE 2 35 RED 2 4 Basal Ganglia Based Simulations 63 Nengo Documentation Release 1 3 inputE 2 5 ZERO inputE 2 7 SQUARE inputE 2 85 BLUE inputE 3 0 ZERO inputE 3 2 CIRCLE inputE 3 35 RED inputE 3 5 ZERO inputE 3 7 SQUARE inputE 3 85 BLUE inputE 4 0 ZERO inputE 4 2 CIRCLE inputE 4 35 RED inputE 4 5 ZERO inputE 4 7 SQUARE inputE 4 85 BLUE net make_input inputE inputE net connect inputE E net add_to_nengo 2 4 5 A Question Answering Network with Memory Purpose This demo performs question answering based on storing items in a working memory Comments This example is very much like the question answering demo it would be good to read that However it answers questions based on past input not the immediate input Usage When you run the network it will start by binding RED and CIRCLE and then binding BLUE and SQUARE so the memory essentially has RED CIRCLE BLUE SQUARE The input from A and B is turned off and E is then run through the vocabulary The appropriately unbound element for each vocabulary word shows up in F as appropriate Output See the screen capture below 64 Chapter 2 Nengo Demos Nengo Documentation Release 1 3 000 Question Answering with Memory
52. D inputE 3 7 SQUARE inputE 3 8 BLUE inputE 3 9 ZERO inputE 4 0 CIRCLE inputE 4 1 RED inputE 4 2 SQUARE inputE 4 3 BLUE inputE 4 4 ZERO inputE 4 5 CIRCLE inputE 4 6 RED inputE 4 7 SQUARE inputE 4 8 BLUE inputE 4 9 ZERO net make_input inputE inputE net connect inputE E net add_to_nengo 2 4 6 A Controlled Question Answering Network Purpose This demo performs question answering based on storing items in a working memory while under control of a basal ganglia Comments This example is very much like the question answering with memory demo it would be good to read that However both the information to bind and the questions to answer come in through the same visual channel The basal ganglia decides what to do with the information in the visual channel based on its content i e whether it is a statement or a question Usage When you run the network it will start by binding RED and CIRCLE and then binding BLUE and SQUARE so the memory essentially has RED CIRCLE BLUE SQUARE It does this because it is told that RED CIRCLE is a STATEMENT i e RED CIRCLE STATEMENT in the code as is BLUE SQUARE Then it is presented with something like QUESTION RED i e What is red The basal ganglia then reroutes that input to be compared to what is in working memory and the result shows up in the motor channel Output See the screen capture below 2 4 Bas
53. GPU driver so the former can t be running while the latter is being changed updated 3 Hit Ctr1 Alt F1 This should take you to a login shell Enter your credentials and then cd into the directory where the driver installer was downloaded 4 To start the installer run the command sudo sh lt driver installer name gt 5 Answer yes to the queries from the installer especially the one that asks to change xorg conf 6 Once installation has finished restart the X server with sudo gdm service start Step 2 Install CUDA Toolkit 1 Be sure you have downloaded the CUDA toolkit installer for your system from the link provided above Note where the file gets downloaded 2 Run the installer with sudo sh lt toolkit installer name gt 3 The installer will ask where you want to install CUDA The default location usr local cuda is the most convenient since parts of the NengoGPU implementation assume it will be there however we can easily change these assumptions by changing some text files so just note where you install it 4 At the end the installer gives a message instructing you to set the values of certain environment variables Be sure to do this The best way to do it permanently is to set them in your bashrc file For example to change the PATH variable to include the path to the bin directory of the cuda installation add the following lines to the end of your bashrc PATH PATH lt path to cuda bin di
54. LE and then unbinding RED from that result and the output will be CIRCLE Then it does the same kind of thing with BLUE SQUARE You can set the input values by right clicking the SPA graphs and setting the value by typing somethign in If you type in a vocabulary word that is not there it will be added to the vocabulary Output See the screen capture below 2 4 Basal Ganglia Based Simulations 61 Nengo Documeniation Release 1 3 AAO Question Answering 1 00RED 2 0 0 5 1 00CIRCLE m 2 0 0 5 1 00RED 0 87CIRCLE 2 1 as a E 2 0 0 1 0 0 5 0 0000 kia Ae 0 8290 P Code D 16 subdim 4 N 100 seed 7 import nef import nef convolution import hrr import math import random random seed seed vocab hrr Vocabulary D max_similarity 0 1 net nef Network Question Answering Create the network object net make A 1 D mode direct Make some pseudo populations so they run well on less powerful machines 1 neuron 16 dimensions direct mode net make B 1 D mode direct net make_array C N D subdim dimensions subdim quick True radius 1 0 math sqrt D Make a real population with 100 neurons per array element and D subdim elements in the array feach with subdim dimensions set the radius as appropriate for multiplying things of this dimension net make E 1 D mode direct net make F 1 D mode direct convil nef con
Singular Value Decomposition, and this process should be significantly faster, especially for larger ensembles. If the install failed, if Nengo cannot detect a CUDA-enabled GPU, or if you simply chose not to install NengoUtilsGPU, then the box will be disabled and an error message will appear to its right. Note that the SVD implementation cannot take advantage of multiple GPUs, which is why there is no option to select the number of GPUs for ensemble creation.
To change whether a GPU is used for ensemble creation from a Python script, use the line:
ca.nengo.math.impl.WeightedCostApproximator.setUseGPU(x)
where x is either TRUE or FALSE.

4.6.2 Windows
Coming soon.

4.7 Integrating with IPython Notebook
We are currently working on experimental support for IPython (http://ipython.org)'s new Notebook system. The IPython Notebook gives a browser-based interface to Python that is similar to the one in Mathematica. However, to make use of Python's graphing features, it must be run from Python, not Jython (which is what Nengo uses). To address this, we can make slight modifications to IPython to tell it to run Nengo scripts using Nengo, but everything else in normal Python.

4.7.1 Installing IPython
• Download and install IPython using the instructions at http://ipython.org/download.html
• You should now be able to activate the notebook system by running:
ipython notebook --pylab
The --pylab argument is optional but initializes the pylab
With the standard Nengo install, creating neural populations with more than 500 neurons takes a very long time. This is because Nengo needs to solve for the decoders, which involves pseudo-inverting an NxN matrix, where N is the number of neurons. By default, Nengo is a pure Java application, so in order to perform this matrix pseudo-inversion it uses a Java implementation of Singular Value Decomposition, which is quite slow. To speed this up, we can tell Nengo to call out to another program (Python) to perform this operation.

4.5.1 Step 1: Install Python, SciPy, and NumPy
Python is a freely available programming language, and it has two extensions, SciPy and NumPy, which provide most of the high-speed matrix operations found in Matlab. For Windows and OS X, download it from the following sites:
• Python: http://www.python.org/getit
• SciPy: http://sourceforge.net/projects/scipy/files/scipy
• NumPy: http://sourceforge.net/projects/numpy/files/NumPy
For Linux it may already be installed, but a command like
sudo apt-get install python python-numpy python-scipy
should install it on many standard Linux distributions, such as Ubuntu.
Important Note: When you install it, use Python 2.7 or 2.6, rather than 3.2. As it says on the Python web page, "start with Python 2.7; more existing third party software is compatible with Python 2 than Python 3 right now."
57. XON add projections net connect controlV getOrigin X convertXY getTermination refXY net connect refX getOrigin origin controlV getTermination inputRefx net connect refY getOrigin origin controlV getTermination inputRefY net connect convertAngles getOrigin currentX controlV getTermination inputCurrentX net connect convertAngles getOrigin currentY controlV getTermination inputCurrentyY net connect F getOrigin origin FX getTermination inputFs net connect convert XY getOrigin shoulderRef funcT getTermination shoulderRef net connect convertXY getOrigin elbowRef funcT getTermination elbowRef net connect Tfunc getOrigin origin funcT getTermination inputTs net connect funcT getOrigin funcTl controlU getTermination inputFuncTl net connect funcT getOrigin funcT2 controlU getTermination inputFuncT2 net connect FX getOrigin FX1 controlU getTermination inputFX1 net connect FX getOrigin FX2 controlU getTermination inputFX2 net add_to_nengo class Room space Room def init__ self space Room __init__ self 10 10 gravity 0 color Color OxFFFFFF Color OxFFFFFF Color OxEEEEEE Color 0OxDDDDDD Color OxCCCCCC Color OxBBBBBB def
_low, rate_high):
    gain = []
    bias = []
    for i in range(count):
        # desired intercept (x value for which the neuron starts firing)
        intercept = random.uniform(intercept_low, intercept_high)
        # desired maximum rate (firing rate when x is maximum)
        rate = random.uniform(rate_low, rate_high)

        # this algorithm is specific to LIF neurons, but should generate
        # gain and bias values to produce the desired intercept and rate
        z = 1.0 / (1 - math.exp((t_ref - (1.0 / rate)) / t_rc))
        g = (1 - z) / (intercept - 1.0)
        b = 1 - g * intercept
        gain.append(g)
        bias.append(b)
    return gain, bias

# random gain and bias for the two populations
gain_A, bias_A = generate_gain_and_bias(N_A, -1, 1, rate_A[0], rate_A[1])
gain_B, bias_B = generate_gain_and_bias(N_B, -1, 1, rate_B[0], rate_B[1])

# a simple leaky integrate-and-fire model, scaled so that v=0 is resting
# voltage and v=1 is the firing threshold
def run_neurons(input, v, ref):
    spikes = []
    for i in range(len(v)):
        dV = dt * (input[i] - v[i]) / t_rc   # the LIF voltage change equation
        v[i] += dV
        if v[i] < 0:
            v[i] = 0                 # don't allow voltage to go below 0
        if ref[i] > 0:               # if we are in our refractory period
            v[i] = 0                 #   keep voltage at zero and
            ref[i] -= dt             #   decrease the refractory period
        if v[i] > 1:                 # if we have hit threshold
            spikes.append(True)      #   spike
            v[i] = 0                 #   reset the voltage
            ref[i] = t_ref           #   and set the refractory peri
59. a starting value of 45 net make neuron neurons 2 dimensions l1 max_rate 100 100 intercept 0 5 0 5 encoders 1 1 noise 3 Make 2 neurons representing 1 dimension with a maximum firing rate of 100 with a tuning curve x intercept of 1 096 1 596 i e the first responds more to positive values and the second to negative values 0 5 encoders of 1 and 1 and a noise of variance 3 net connect input neuron Connect the input to the neuron net add_to_nengo 2 1 3 Population of Neurons Purpose This demo shows how to construct and manipulate a population of neurons Comments These are 100 leaky integrate and fire LIF neurons The neuron tuning properties have been randomly selected Usage Grab the slider control and move it up and down to see the effects of increasing or decreasing input As a population these neurons do a good job of representing a single scalar value This can be seen by the fact that the input graph and neurons graphs match well Output See the screen capture below 2 1 Introductory Demos Nengo Documentation Release 1 3 Many Neurons input _ neurons input neurons 1 1 1 1 0 06 1 24 1 74 1 24 1 74 0 0000 ee 2 8440 Code import nef net nef Network Many Neurons Create the network net make_input input 0 45 Create a contro
a value within the ensemble's radius. In most cases we specify this by giving a range of maximum firing rates, and each neuron will have a maximum chosen uniformly from within this range. This gives a somewhat biologically realistic amount of diversity in the tuning curves. The following line makes neurons with maximums between 200Hz and 400Hz:

net.make('E', 100, 2, max_rate=(200, 400))

Alternatively, we can specify a particular set of maximum firing rates, and each neuron will take on a value from the provided list. If there are more neurons than elements in the list, the provided values will be re-used:

net.make('F', 100, 2, max_rate=[200, 250, 300, 350, 400])

Note: The type of brackets used is important. Python has two types of brackets for this sort of situation: round brackets and square brackets. Round brackets create a tuple, which we use for indicating a range of values to randomly choose within, and square brackets create a list, which we use for specifying a list of particular values to use.

3.3.3 Intercept
The intercept is the point on the tuning curve graph where the neuron starts firing. For example, for a one-dimensional ensemble, a neuron with a preferred direction vector of 1 and an intercept of 0.3 will only fire when representing values above 0.3. If the preferred direction vector is -1, then it will only fire for values below -0.3. In general, the neuron will only fire if the dot product of x (the value
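To pair with the examples above, here is a small sketch of setting the intercept range when creating an ensemble; the ensemble names are illustrative.

import nef

net = nef.Network('Intercept Examples')

# intercepts drawn uniformly from the full default range
net.make('G', 100, 1, intercept=(-1, 1))

# only use intercepts between 0.3 and 1.0, so each neuron stays silent until
# the represented value (along its preferred direction) exceeds at least 0.3
net.make('H', 100, 1, intercept=(0.3, 1.0))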
adius=1.5)
net.make('B', 50, 1)
net.connect(input.getOrigin('input'), 'A')          # connect the input
net.connect('B', output.getTermination('save'))     # connect the output
def multiply(x):
    return x[0] * x[1]
net.connect('A', 'B', func=multiply)
net.add_to_nengo()

4.3.4 Running the Simulation
The model so far should run successfully within Nengo using the standard approach of going into the interactive plots mode and clicking the run button. However, we can also have the model automatically run right from within the script. This bypasses the visual display, making it run faster. The following command runs the simulation for 2 seconds (this is 2 simulated seconds, of course; the actual time needed to run the simulation depends on your computer's speed and the complexity of the network):

net.run(2.0)

The parameter indicates how long to run the simulation for, and you can also optionally specify the time step (default of dt=0.001). As an alternative method, you can also run the simulation like this, producing an equivalent result:

t = 0
while t < 2.0:
    net.run(0.1)
    # insert any code you want to run every 0.1 seconds here
    t += 0.1

With either approach, the simulation will be automatically run when you run the script. With this approach, you can even run a script without using the Nengo user interface at all. Instead, you can run the model from the command line. Instead of running nengo (or nengo.bat on Wi
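Since a scripted run bypasses the graphical display, it is common to record data while the script runs. The following hedged sketch combines net.run with the net.log interface used later in this manual; the population names and probe choices are illustrative.

import nef

net = nef.Network('Headless Run')
net.make_input('input', [0.5])
net.make('A', 100, 1)
net.connect('input', 'A')

log = net.log()           # create a log that records probed data for later analysis
log.add('input', tau=0)   # record the raw input
log.add('A')              # record the decoded (filtered) value of A

net.run(2.0)              # run for 2 simulated seconds, no user interface needed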
62. afe bet Install it with sudo apt get install gcc 4 4 If you have gcc 4 4 installed in some directory other than the default usr bin then you have to edit the GCC_PATH variable in the NengoGPU Makefile to point there 120 Chapter 4 Advanced Nengo Usage Nengo Documentation Release 1 3 5 Type make to compile the code in the directory If successful this creates a shared library called libNengoGPU so This is the native library that Nengo will call to perform the neural simulations on the GPU s 6 xxx Redo steps 1 3 in the NengoUtilsGPU directory which should be located in the same directory as the NengoGPU directory In this case there are two additional variables in the Makefile that you might have to edit which point to CULA libraries and include files CULA_INC_PATH and CULA_LIB_PATH Again you only have to edit these if you installed CULA in a location other than the default 7 We have make sure that the CUDA libraries which are referenced by libNengoGPU so can be found at runtime To acheive this cd into etc 1d so conf d Using your favourite text editor and ensuring you have root priveleges create a text file called cuda conf eg sudo vim cuda conf In this file type the lines lt absolute path to CUDA dir gt lib lt absolute path to CUDA dir gt 1ib64 So for example if you installed CUDA in the default location you should have usr local cuda lib usr local cuda 1ib64 x x If you ar
ake('B', 100, 1, storage_code='B')  # Make a population with 100 neurons, 1 dimension
net.connect('input', 'A')  # Connect the input to A
net.connect('A', 'B', func=lambda x: x[0]*x[0])  # Connect A and B with the defined function, approximated in that connection
net.add_to_nengo()

2.2.3 Addition
Purpose: This demo shows how to construct a network that adds two inputs.
Comments: Essentially, this is two communication channels into the same population. Addition is thus somewhat "free", since the incoming currents from different synaptic connections interact linearly (though two inputs don't have to combine in this way; see the combining demo).
Usage: Grab the slider controls and move them up and down to see the effects of increasing or decreasing input. The C population represents the sum of the A and B representations. Note that the addition is a description of neural firing in the decoded space: neurons don't just add all the incoming spikes. The NEF has determined appropriate connection weights to make the result in C interpretable (i.e., decodable) as the sum of A and B.
Output: See the screen capture below.
[Screen capture: Addition network with inputA → A, inputB → B, and both feeding C]
Code:
import nef
net = nef.Network('Addition')  # Create the netw
64. al Ganglia Based Simulations 67 Nengo Documentation Release 1 3 Question Answering with Control 0 39CIRCLE 1 memory i visual 0 884 ND motor channel gatel i gate1 i m ate PO i 9 StrD1 STN StrD2 _ va aL A thalamus GPi 4 GPe 0 0000 Al 0 8830 Code D 16 number of dimensions for representations N 100 number of neurons per dimension import nef import nps import nef convolution import hrr import math import random semantic pointers do not work well with small numbers of dimensions To keep this example model small enough to be easily run we have lowered the number of dimensions D to 16 and chosen a random number seed for which this model works well seed 17 Se Fe net nef Network Question Answering with Control seed seed Make a simple node to generate interesting input for the network random seed seed vocab hrr Vocabulary D max_similarity 0 1 class Input nef SimpleNode def init ___ self name self zero 0 D nef SimpleNode __init__ self name self RED_CIRCLE vocab parse STATEMENT RED CIRCLE v self BLUE_SQUARE vocab parse STATEMENT BLUE SQUARE v self RED vocab parse QUESTION RED v self SQUARE vocab parse QUESTION SQUARE v def origin_x self if 0 i lt sel f t lt 0 3 68 Chapter 2 Nengo Demos Neng
al parameters in the neuron model
• plastic_array (boolean): configure the connection to be learnable. See nef.Network.learn.
• create_projection (boolean): flag to disable actual creation of the connection. If False, any needed Origin and/or Termination will be created, and the return value will be the tuple (origin, termination) rather than the created projection object.
Returns: the created Projection, or (origin, termination) if create_projection is False.

learn(post, learn_term, mod_term, rate=4.999999999999999e-07, **kwargs)
Apply a learning rule to a termination of the post ensemble. The mod_term termination will be used as an error signal that modulates the changing connection weights of learn_term.
Parameters:
• post (string or Ensemble): the ensemble whose termination will be changing, or the name of this ensemble
• learn_term (string or Termination): the termination whose transform will be modified by the learning rule. This termination must be created with plastic_array=True in nef.Network.connect.
• mod_term (string or Termination): the modulatory input to the learning rule; this is optional. It will be checked if learn_term is a ModulatedPlasticEnsembleTermination, and not otherwise.
• rate (float): the learning rate that will be used in the learning functions.
Todo: Possible enhancement: make this 2D for stdp mode, di
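As a rough usage sketch of learn: the ensemble sizes below are made up, and the name of the learnable termination passed to learn() is an assumption (inspect the post ensemble to find the termination actually created by connect in your model).

import nef

net = nef.Network('Learning Sketch')
net.make('pre', 60, 1)
net.make('post', 60, 1)
net.make('error', 60, 1)

# a learnable connection whose transform the rule will modify
net.connect('pre', 'post', plastic_array=True)
# a modulatory connection supplying the error signal
net.connect('error', 'post', modulatory=True)

# assumed termination name 'pre'; the default learning rate is roughly 5e-7
net.learn('post', 'pre', 'error', rate=5e-7)
net.add_to_nengo()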
pylab.figure()
pylab.plot(x, B)
pylab.title('Tuning curves for population B')

pylab.figure()
pylab.plot(times, inputs, label='input')
pylab.plot(times, ideal, label='ideal')
pylab.plot(times, outputs, label='output')
pylab.title('Simulation results')
pylab.legend()

pylab.show()

CHAPTER SIX
JAVA API DOCUMENTATION
The Java API documentation is not a part of the User Manual, but can be found here: http://www.nengo.ca/javadoc
67. but inverting the second input net connect C C pstc 0 4 Recurrently connect C so it acts as a memory Create input to model CIRCLE vocab parse CIRCLE v Create a vocabulary BLUE vocab parse BLUE v ED vocab parse RED v ERO 0 D inputA inputA 0 RED inputA 0 25 BLUE inputA 0 5 ZERO R SQUARE vocab parse SQUARE v Z net make_input inputA inputA net connect inputA A inputB inputB 0 CIRCL inputB 0 25 SQUARE inputB 0 5 ZERO EJ net make_input inputB inputB net connect inputB B inputE inputE 0 ZERO inputE 0 5 CIRCLE inputE 0 6 RED inputE 0 7 SQUARE inputE 0 8 BLUE inputE 0 9 ZERO inputE 1 0 CIRCLE inputE 1 1 RED inputE 1 2 SQUARE inputE 1 3 BLUE inputE 1 4 ZERO inputE 1 5 CIRCLE inputE 1 6 RED inputE 1 7 SQUARE inputE 1 8 BLUE inputE 1 9 ZERO inputE 2 0 CIRCLE inputE 2 1 RED inputE 2 2 SQUARE inputE 2 3 BLUE inputE 2 4 ZERO inputE 2 5 CIRCLE inputE 2 6 RED inputE 2 7 SQUARE inputE 2 8 BLUE inputE 2 9 ZERO inputE 3 0 CIRCLE inputE 3 1 RED inputE 3 2 SQUARE 66 Chapter 2 Nengo Demos Nengo Documentation Release 1 3 inputE 3 3 BLUE inputE 3 4 ZERO inputE 3 5 CIRCLE inputE 3 6 RE
by the mouse scroll wheel, as desired.
[Screen capture: value graphs and voltage grids for the input and the two ensembles]
• Notice that the neural ensembles can be representing the same value but have a different firing pattern.
• Close the Interactive Plots when you are finished.

1.2.5 Adding Scalars
• If we want to add two values, we can simply add another termination to the final ensemble and project to it as well.
• Create a termination on the second ensemble called input 2.
• Create a new ensemble.
• Create a projection from the X origin to input 2.
[Screen capture: the network with the new ensemble and external input added]
• Create a new Function Input and set its constant value to -0.7.
• Add the required termination and projection to connect it to the new ensemble.
[Screen capture: the completed network in the Basic Network viewer]
• Open Interactive Plots.
• Show the controls for the two inputs.
• Create value graphs for the three neural ensembles.
• Press Play to start the simulation. The value for the final ensemble should be 0.5 + (-0.7) = -0.2.
• Use the control sliders to adjust the input. The output should still be the sum of the inputs.
[Screen capture: the three value graphs and the two input controls during the simulation]
cations Channel')  # Create the network object
net.make_input('input', [0.5])  # Create a controllable input function with a starting value of 0.5
net.make('A', 100, 1)  # Make a population with 100 neurons, 1 dimension
net.make('B', 100, 1)  # Make a population with 100 neurons, 1 dimension
net.connect('input', 'A')  # Connect all the relevant objects
net.connect('A', 'B')
net.add_to_nengo()

2.2.2 Squaring the Input
Purpose: This demo shows how to construct a network that squares the value encoded in a first population in the output of a second population.
Comments: This is a simple nonlinear function being decoded in the connection weights between the cells. Previous demos are linear decodings.
Usage: Grab the slider control and move it up and down to see the effects of increasing or decreasing input. Notice that the output value does not go negative, even for negative inputs. Dragging the input slowly from -1 to 1 will approximately trace a quadratic curve in the output.
Output: See the screen capture below.
[Screen capture: Squaring network, with the input and squared output graphs]
Code:
import nef
net = nef.Network('Squaring')  # Create the network object
net.make_input('input', [0])  # Create a controllable input function with a starting value of 0
net.make('A', 100, 1)  # Make a population with 100 neurons, 1 dimension
net.m
ck on the ensemble, select Plot > Constant Rate Responses.
[Plot: Activities, showing the firing rate (spikes/s) of each neuron as a function of the represented value x]
• tauRC affects the linearity of the neurons; smaller values are more linear.
• Max rate affects the height of the curves at the left and right sides.
• Intercept affects where the curves hit the x-axis (i.e., the value where the neuron starts firing).

1.1.5 Plotting Representation Error
• We often want to determine the accuracy of a neural ensemble.
• Right-click on the ensemble, select Plot > Plot Distortion: X.
[Plot: Distortion, showing the ideal and actual decoded values and the resulting error across the represented range]
• Mean Squared Error (MSE) is also shown at the top.
• MSE decreases as the square of the number of neurons, so RMSE is proportional to 1/N.
• We can also affect representation accuracy by adjusting the range of intercepts. This will cause the system to be more accurate in the middle of the range and less accurate at the edges.

1.1.6 Adjusting an Ensemble
• After an ensemble is created, we can inspect and modify many of its parameters.
• Right-click on an ensemble and select I
71. dit View Options Help Ba o e ad Basic Network Network Viewer Basic Network Network gt Basic Network Network 1 2 2 Creating Projections e We can now connect the two neural ensembles e Every ensemble automatically has an origin called X This is an origin suitable for building any linear transfor mation In Part Three we will show how to create origins for non linear transformations 1 2 Linear Transformations 9 Nengo Documentation Release 1 3 File Edit View Options Help Basic Network Network gt Basic Network Network e Click and drag from the origin to the termination This will create the desired projection File Edit View Options Help Basic Network Network gt Basic Network Network 1 2 3 Adding Inputs e In order to test that this projection works we need to set the value encoded by the first neural ensemble We do this by creating an input to the system This is how all external inputs to Nengo models are specified e Drag a Function Input icon from the sidebar into the network e Give it a name for example external input e Set its output dimensions to 1 10 Chapter 1 Nengo Tutorial Nengo Documentation Release 1 3 Templates 3 v New Remove Name external input Functions Generators Output Dimensions 1 Set Functions Ok 1 Cancel e Press Set Function to define the behaviour of this input e Select Constant Function from the drop down list and
dividually, with the results concatenated together. For example, the following code creates an array and then computes the sum of the squares of each value within it:

net = nef.Network('Squaring Array')
input = net.make_input('input', [0, 0, 0, 0, 0])
A = net.make_array('A', neurons=100, length=5)
B = net.make('B', neurons=100, dimensions=1)
net.connect(input, A)
def square(x):
    return x[0] * x[0]
net.connect(A, B, transform=[1, 1, 1, 1, 1], func=square)

All of the parameters from nef.Network.make can also be used.
If the storage_code parameter is used, you may use %d (or variants such as %02d) in the storage code, which will be replaced by the index number of the ensemble in the array. Thus storage_code='a%02d' will become 'a00' for the first ensemble, 'a01' for the second, 'a02' for the third, and so on.
If the encoders parameter is used, you can provide either the standard array of encoders (e.g., [[1],[-1]]) or a list of sets of encoders for each ensemble (e.g., [[[1]],[[-1]]]).
If the dimensions parameter is used, each ensemble represents the specified number of dimensions (the default value is 1 dimension per ensemble). For example, if length=5 and dimensions=2, then a total of 10 dimensions are represented by the network array. The index number of the first ensemble's first dimension is 0, the index of the first ensemble's second dime
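A short sketch pulling together the make_array options described above; the names, sizes, and storage-code pattern are illustrative.

import nef

net = nef.Network('Array Options')

# 5 ensembles of 100 neurons, 2 dimensions per ensemble (10 dimensions total);
# the storage codes become b00, b01, ..., b04
net.make_array('B', neurons=100, length=5, dimensions=2, storage_code='b%02d')

# the usual make() parameters, such as radius and quick, pass through as well
net.make_array('C', neurons=50, length=3, radius=2, quick=True)
net.add_to_nengo()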
73. e system to choose between two different actions whether to go through the sequence or be driven by the visual input If the visual input has its value set to 0 8 START D for instance it will begin cycling through at D gt E etc The 0 8 scaling helps ensure start is unlikely to accidently match other SPAs which can be a problem in low dimensional examples like this one The utility graph shows the utility of each rule going into the basal ganglia The rule graph shows which one has been selected and is driving thalamus Usage When you run the network it will go through the sequence forever starting at D You can right click the SPA graph and set the value to anything else e g 0 8 START B and it will start at the new letter and then begin to sequence through The point is partly that it can ignore the input after its first been shown and doesn t perseverate on the letter as it would without gating Output See the screen capture below Routing 0 63B 0 60A ON RS input vision gt _ BG 0 0 5 Utility Rules 0 028 1 A C 0 00 C 1 0 0 5 D 0 01 D E 0 08 E start 0 00 start 0 0000 J j gt 0 5370 60 Chapter 2 Nengo Demos Nengo Documentation Release 1 3 Code from spa import x D 16 class Rules Define the rules by specifying the start state and the desired next state def start vision START set state vision
74. e 0 42a e J 2 0 465 0 965 0 465 0 965 0 0000 D 1 0580 e To deal with high dimensional vectors we don t want to have to set each individual value for each vector would need 100 controls to configure a single 100 dimensional vector e Nengo Interactive Plots has a specialized semantic pointer graph for these high dimensional cases Instead of showing the value of each element in the vector as with a normal graph it shows the similarity between the currently represented vector and all the known vectors How much like CAT is this How much like DOG How much like RED How much like TRIANGLE 1 5 Cognitive Models 31 Nengo Documentation Release 1 3 You can configure which comparisons are shown using the right click menu You can also use it to _set_ the contents of a neural group by right clicking and choosing set value This will force the neurons to represent the given semantic pointer You can go back to normal behaviour by selecting release value e Use the right click menu to set the input values to a and b The output should be similar to a b This shows that the network is capable of computing the circular convolution operation which binds two semantic pointers to create a third one e Use the right click menu to set the input values to a and a b The output should be similar to b Note is an approximate inverse
75. e Create two function inputs called input and control Start with Constant functions with a value of 1 Use the script console to set the input function by clicking on it and entering the same input function as used above that functions ca nengo math impl PiecewiseConstantFunction 0 2 0 3 0 44 0 54 0 8 0 9 0 e Connect the input function to the input termination the control function to the control termination and the product origin to the feedback termination 28 Chapter 1 Nengo Tutorial Nengo Documentation Release 1 3 File Edit View Options Help 1G ntrolledintegrator Network Viewer f x input eo w i ae control o f x Integrator origin k control Controlledintegrator Network gt Controlledintegrator Network e Go to Interactive Plots and show a graph for the value of the ensemble right click gt X gt value If you run the simulation this graph will show the values of both variables stored in this ensemble the integrated value and the control signal For clarity turn off the display of the cotrol signal by right clicking on the graph and removing the checkmark beside v 1 e The performance of this model should be similar to that of the non controlled integrator control reef gt input 10 input 10 0 0000 y 1 0620 e Now adjust the control input to be 0 3 instead of 1 This will make the integrator into a leaky int
76. e G inputB gt B A y 1 1 a a eee n 0 37 1 0 0000 p 3 7550 Code import nef net nef Network Combining Create the network object net make_input input A 0 Create a controllable input function with a starting value of 0 net make_input input B 0 Create another controllable input function with a starting value of 0 net make A 100 1 Make a population with 100 neurons 1 dimension net make B 100 1 Make a population with 100 neurons 1 dimension net make C 100 2 radius 1 5 Make a population with 100 neurons 2 dimensions and set a larger radius so 1 1 input still fits within the circle of that radius net connect input A A Connect all the relevant objects default connection is identity net connect input B B net connect A C transform 1 0 Connect with the given 1x2D mapping matrix net connect B C transform 0 1 net add_to_nengo 2 2 5 Performing Multiplication Purpose This demo shows how to construct a network that multiplies two inputs Comments This can be thought of as a combination of the combining demo and the squaring demo Essentially we project both inputs independently into a 2D space and then decode a nonlinear transformation of that space the product of the first and second vector elements 2 2 Simple Transformations 47 Nengo Documentation Release 1 3
77. e installing NengoUtilsGPU as well then you also have to add the lines lt absolute path to CULA dir gt lib lt absolute path to CULA dir gt 1ib64 Save the file and exit Finally run sudo ldconfig This populates the file etc ld so cache using the files in etc 1ld so conf d 1d so cache tells the machine were to look for shared libraries at runtime in addition to the default locations like usr lib and usr local 1lib 8 This step is only for developers running Nengo through an IDE like Eclipse Those using a prepackaged version of Nengo can skip this step The Java Virtual Machine has to be told where to look for native libraries Edit the JVM arguments in your Run and Debug configurations so that they contains the following text Djava library path lt absolute path to NengoGPU dir gt xxx If you are also installing NengoUtilsGPU then you must also add lt absolute path to NengoUtilsGPU dir gt using a colon as the separator between paths Step 6 Using NengoGPU and NengoUtilsGPU 1 NengoGPU provides support for running certain NEF Ensembles on the CPU while the rest are simulated on the GPU s Right click on the NEF Ensembles that you want to stay on the CPU and select the configure option Set the useGPU field to false and the ensemble you are configuring will run on the CPU no matter what You can also edit the same field on a Network object and it will force all NEF Ensembles within the Network to
78. e is no error though there is an actual error The difference here is that error is calculated by a neural population and used by the learning rule while actual error is computed mathematically and is just for information Repeat the experiment After a few simulated seconds the post and pre will match well You can hit the reset button bottom left and the weights will be reset to their original random values and the switch will go to zero For a different random starting point you need to re run the script Output See the screen capture below Learn Communication switch error actual error 1 1 switch _ gt Gate v a aa N error 1 05 e 1 input 9 019 9 519 9 019 9 519 input gt pre gt post input pre post 1 1 1 post pre_00 1 1 4 9 019 9 519 9 019 9 519 9 019 9 519 5 5180 eT 9 5180 Code N 60 D 1 import nef import nef templates learned_termination as learning import nef templates gate as gating import random random seed 27 net nef Network Learn Communication Create the network object Create input and output populations 70 Chapter 2 Nengo Demos Nengo Documentation Release 1 3 net make pre N D Make a population with 60 neurons 1 dimensions net make post N D Make a population with 60 neurons 1 dimensions Create a random function input net make_fourier_input input base 0 1 high 10
ections coming into it. Select Inspector. Double-click on transform.
• Double-click on the 1.0 and change it to 2.0.
[Inspector: DecodedTermination properties (dimensions, dynamics, initialState, input, modulatory, name, node, output, scaling, staticBias, tau, transform), with Set Rows / Set Columns controls]
• Click Save Changes.
• Now run the simulation. The final result should be 2*0.5 + (-0.7) = 0.3.

1.2.7 Multiple Dimensions
• Everything discussed above also applies to ensembles that represent more than one dimension.
• To create these, set the number of dimensions to 2 when creating the ensemble.
[Ensemble dialog: Name, Number of Nodes = 200, Dimensions = 2, Node Factory = LIF Neuron, Radius = 1.0]
• When adding a termination, the input dimension can be adjusted. This defines the shape of the transformation matrix for the termination, allowing for projections that change the dimension of the data.
[Termination dialog: Name = input, Input Dim, Set Weights, tauPSC]
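In a script, the same weight change is expressed with the weight (or transform) argument of connect. A small sketch mirroring the network above; the names and values are illustrative.

import nef

net = nef.Network('Weighted Addition')
net.make_input('input1', [0.5])
net.make_input('input2', [-0.7])
net.make('A', 100, 1)
net.make('B', 100, 1)
net.make('C', 100, 1)
net.connect('input1', 'A')
net.connect('input2', 'B')
net.connect('A', 'C', weight=2)   # same effect as setting the termination transform to 2.0
net.connect('B', 'C')
net.add_to_nengo()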
80. edTermination controlV addDecodedTermination inputCurrentX 1 0 pstc False inputCurrentyY 0 1 pstc False inputRefX 1 0 pstc False inputRefY 0 1 pstc False controlU addDecodedTermination 0 pstc False controlU addDecodedTermination controlU addDecodedTermination controlU addDecodedTermination inputFX2 inputFuncTl 1 inputFuncT2 0 1 pstc False inputFX1 1 0 pstc False 0 1 pstc False add origins interpreter DefaultFunctionInterpreter convertXY addDecodedOrigin elbowRef getElbow AXON convertXY addDecodedOrigin shoulderRef getShoulder AXON convertAngles addDecodedOrigin currentX getX AXON convertAngles addDecodedOrigin currentY getY AXON FX addDecodedOrigin FX1 interpreter parse x0 x8 x1 x9 x2 x10 x3 x11 12 AXON 2 6 Miscellaneous 81 Nengo Documentation Release 1 3 FX addDecodedOrigin FX2 interpreter parse x4 x8 x5 x9 x6 x10 x7 x11 12 AXON funcT addDecodedOrigin funcTl interpreter parse x0 x2 x1 x3 6 AXON funcT addDecodedOrigin funcT2 interpreter parse x0 x4 x1 x5 6 AXON controlU addDecodedOrigin ul interpreter parse x0 2 AXON controlU addDecodedOrigin u2 interpreter parse x1 2 A
81. een the cortex and the basal ganglia The THEN portion of a rule says what semantic pointers to send to what areas of the brain This is again a linear operation that can be computed on the output of the thalamus using the output from the basal ganglia e Simple example Five possible states A B C D and E Rules for IF A THEN B IF B THEN C IF C THEN D IF D THEN E IF E THEN A Five production rules semantic pointer mappings cycling through the five states open demo sequence py npani StrD2 buffer memory 0 60C 0 60A B 0 33B buffer memo eg 1 f Os GPe thal_memory No 1 1 0 145 0 645 0 145 0 645 0 0000 a 0 6440 e Can set the contents of working memory in Interactive Plots by opening an SP graph right clicking on it and choosing set value use release value to allow the model to change the contents e Cycle time is around 40ms slightly faster than the standard 50ms value used in ACT R Soar EPIC etc This depends on the time constant for the neurotransmitter GABA 1 5 Cognitive Models 35 Nengo Documentation Release 1 3 biologically consistent values 2 4 6 8 10 12 14 16 18 20 inhibitory time constant ms 1 5 5 Routing of Information e What about more complex actions Same model as above be we want visual input to be able to control where we start the sequence Simple approach add a visual buffer and connect it to the working memory o
82. egrator This value adjusts how quickly the integrator forgets over time control O A an eo a o 1 input input 10 0 0000 y 1 0520 1 4 Feedback and Dynamics 29 Nengo Documentation Release 1 3 e You can also run this example using demo controlledintegrator py 1 5 Cognitive Models 1 5 1 Larger Systems e So far we ve seen how to implement the various basic components representations linear transformation non linear transformation feedback e It is possible to use these components to build full cognitive models using spiking neurons see http nengo ca build a brain Constrained by the actual properties of real neurons in real brains numbers of neurons connectivity neurotransmitters etc That should be able to produce behavioural predictions in terms of timing accuracy lesion effects drug treatments etc e Some simple examples Motor control x take an existing engineering control model for what angles to move joints to to place the hand at a particular position open demo armcontrol py T matrix ie 7 convert Angles T ana signal u convert A am sna v JT i X Fi refY Room gt FX A 0 0 5 0 0000 OE 0 9260 Braitenberg vehicle connect range sensors to opposite motors on a wheeled robot open demo vehicle py 30 Chapter 1 Nengo Tutorial Nengo Documentation Release 1 3
83. ent one in the Notebook and enter the following import stats data stats Reader plot data time data input plot data time data A 4 7 6 Running a model with different parameter settings e First we need to identify the parameters we might want to vary in the Nengo model Do this by defining them as variables at the top of the code For example here are my parameters N 50 import nef net nef Network Test net make A N 1 net make_input input 0 1 0 2 0 5 0 4 0 0 6 0 5 0 8 1 0 net connect input A log net log log add input tau 0 log add A net run 1 0 The parameters should be defined before any import statements but can be after any comments at the top of the file e Re run the model Ctrl Enter e Make a new cell with the following code import stats stats run experiment Test 3 N 5 10 20 50 The model will be run 3 times at each parameter setting N 5 N 10 N 20 and N 50 for a total of 12 runs Each parameter combination has its own directory inside the experiment Test directory 4 7 7 Plotting data from varying parameter settings e We use the stats Stats class to access the data from simulation runs For example to get the data from the runs where N 5 we can do import stats s stats Stats experiment Test data s data N 5 for i in range len data plot data time i data A i 124 Chapter
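Building on the access pattern above, the runs at one parameter setting can be averaged with ordinary Python. Only stats.Stats and s.data() come from the stats module; the averaging code is mine and assumes every run logged the same number of time steps.

import stats

s = stats.Stats('experiment/Test')
data = s.data(N=5)

n_runs = len(data)
n_steps = len(data.A[0])
avg_A = [sum(data.A[run][i] for run in range(n_runs)) / float(n_runs)
         for i in range(n_steps)]

plot(data.time[0], avg_A)   # plot() is available when the notebook is started with pylab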
84. esting to rerun the simulation from the start but decreasing the control input before the automatic input starts to see the effects of leaky integration Output See the screen capture below Controlled Integrator control E control TOE 4 0 26 0 137 1 137 1 0 137 1 137 10 input 0 00 10 0 137 1 137 0 0000 1 1360 Code 2 3 Dynamics 53 Nengo Documentation Release 1 3 import nef This implements the controlled integrator described in the book How to build a brain net nef Network Controlled Integrator 2 Create the network object net make_input input 0 2 5 0 3 0 0 44 10 0 5470 O 825 0 920 J Create a controllable input function with a default function that goes to 5 at time 0 2s at time 0 3s and so on net make_input control 0 Create a controllable input function with a starting value of 0 net make A 225 2 radius 1 5 Make a population with 225 neurons 2 dimensions anda larger radius to accommodate large simulataneous inputs net connect input A transform 0 1 0 pstc 0 1 Connect all the relevant fobjects with the relevant 1x2 mappings postsynaptic time constant is 10ms net connect control A transform 0 1 pstc 0 1 def feedback x return x 0 x 1 x 0 Different than the other controlled integrator net connect A A transform 1 0 func feedback pstc 0 1 Create the recurrent
85. fX input if tx is not None wt origin x tx values 0 0 7 else wt origin x 0 7 ty controlV getTermination inputRefY input if ty is not None wt origin y ty values 0 0 1 else wt origin y 0 1 wt origin z 2 ms self target physics motionState ms worldTransform wt self target physics motionState ms self vell vl z self vel2 v2 z yield 0 0001 r ccm nengo create Room net add r need to make hingel hinge2 vell and vel external nodes and hook up the output to the FX matrix r exposeOrigin r getNode hingel getOrigin origin shoulderAngle r exposeOrigin r getNode hinge2 getOrigin origin elbowAngle r exposeOrigin r getNode vell getOrigin origin shoulderVel r exposeOrigin r getNode vel2 getOrigin origin elbowVel net connect r getOrigin shoulderAngle FX getTermination X1 net connect r getOrigin elbowAngle FX getTermination X2 net connect r getOrigin shoulderVel FX getTermination X3 net connect r getOrigin elbowVel FX getTermination X4 net connect r getOrigin shoulderAngle convertAngles getTermination shoulder net connect r getOrigin elbowAngle convertAngles getTermination elbow net connect r getOrigin shoulderAngle funcT getTermination
86. fferent rates for in_fcn and out_fcn If stdp is True a triplet based spike timinng dependent plasticity rule is used based on that defined in Pfister J and Gerstner W 2006 Triplets of Spikes in a Model of Spike Timing Dependent Plasticity J Neurosci 26 9673 9682 The parameters for this learning rule have the following defaults and can be set as keyword arguments to this function call a2Minus 5 0e 3 a3Minus 5 0e 3 tauMinus 70 tauxX 70 a2Plus 5 0e 3 a3Plus 5 0e 3 tauPlus 70 tauY 70 decay None homeostatis None If stdp is False a rate mode error minimizing learning rule is applied The only parameter available here is whether or not to include an oja normalization term oja True learn_array array learn_term mod_term rate 4 999999999999999Se 07 kwargs Apply a learning rule to a termination of aca nengo model impl NetworkArrayImp1 an array of ensembles created using nef Network make_array See nef Network learn for parameters add_to_nengo Add the network to the Nengo user interface If there is no user interface i e if Nengo is being run via the command line only interface nengo c1 then do nothing add_to world None Add the network to the given Nengo world object If there is a network with that name already there remove the old one If world is None it will attempt to find a running version of Nengo to add to Deprecated since version 1 3 Use nef Network add_to_ne
87. graphic system so it is recommended A browser window will appear with the notebook interface 4 7 2 Configuring IPython Notebook e We need to customize Python a bit to tell it about Nengo Create a configuration profile with the command ipython profile create e Open the newly created file profile_default ipython_config py It should be off your new Python directory On Linux this is usually config ipython For other operating systems see http Apython org ipython doc stable config overview html configuration file location e Add the following lines to the bottom of the file 122 Chapter 4 Advanced Nengo Usage Nengo Documentation Release 1 3 import sys sys path append python This will give Python access to Nengo s python directory This will let you import things from that directory most importantly the st ats module will be useful for running Nengo models and extracting data from the logs files 4 7 3 Telling IPython to use Nengo e We now need to make a small change to the Python interface We do this by editting one of the javascript files it uses to create its interface e First find your python install s site packages directory This varies by computer but you can get a list by doing import sys for f in sys path if f endswith packages print f e Find the Python directory IPython frontend html notebook static js e Edit notebook js Find the line containing Notebook prototype execu
h that name already exists, the existing origin will be used rather than creating a new one.
func can also be a matrix of values of the same length as the eval_points provided to the make function for the pre population. This uses the data as a lookup table. For example, we can compute XOR using the following:

net.make('A', 50, 2, eval_points=[[-1, -1], [-1, 1], [1, -1], [1, 1]])
net.make('B', 50, 1)
net.connect('A', 'B', func=[[-1], [1], [1], [-1]])

If weight_func is not None, the connection will be made using a synaptic connection weight matrix rather than a DecodedOrigin and a DecodedTermination. The computed weight matrix will be passed to the provided function, which is then free to modify any values in that matrix, returning a new one that will actually be used. This allows for direct control over the connection weights, rather than just using the ones computed via the NEF methods. If you do not want to modify these weights but do want Nengo to compute the weight matrix, you can just set expose_weights to True.
Parameters:
• pre: The item to connect from. Can be a string (the name of the ensemble), an Ensemble (made via nef.Network.make), an array of Ensembles (made via nef.Network.make_array), a FunctionInput (made via nef.Network.make_input), or an Origin.
• post: The item to connect to. Can be a string (the name of the ensemble), an Ensemble (made via nef.Network.make), an array of Ensembles made via
89. he code simulating the LIF neurons but each neuron has different parameters and input On the other hand a SimpleNode that you have defined yourself in a Python script cannot run on the GPU because SimpleNodes can contain arbitrary code The GPU can also be used to speed up the process of creating NEF Ensembles The dominant component in creating an NEF Ensemble in terms of runtime is solving for the decoders which requires performing a Singular Value Decom position SVD on a matrix with dimensions number of neurons x number of neurons If the number of neurons in the ensemble is large this can be an extremely computationally expensive operation You can use CUDA to perform this SVD operation much faster The Nengo GPU implementation requires the CUDA developer driver and runtime libraries for your operating system You may also want to install the CUDA code samples which let you test whether your machine can access and communicate with the GPU and whether the GPU is performing up to standards Installers for each of these can be downloaded from here http developer nvidia com cuda toolkit 40 As a first step you should download a copy of each of these The SVD computation requires a third party GPU linear algebra toolkit called CULA Dense in addition to the CUDA toolkit CULA Dense can be downloaded free of charge though does require a quick registration from here http culatools com downloads dense Download version R13a as it is the
90. icking on the ensemble and selecting X->value. This is the standard value graph that just shows the value being represented by this ensemble.
• Press Play to run the simulation. With the default input of 0.5, the squared value should be near 0.25. Use the control to adjust the input. The output should be the square of the input.

[Screen capture: the squaring network, with value graphs for the input, A, and A squared.]

• You can also run this example by opening demo/squaring.py

1.3.2 Functions of multiple variables

• Since X, the value being represented by an ensemble, can also be multidimensional, we can also calculate these sorts of functions (a scripting sketch of this kind of function appears after this entry):
  f(x) = x0 * x1
  f(x) = max(x0, x1)
• To begin, we create two ensembles and two function inputs. These will represent the two values we wish to multiply together. The ensembles should be one-dimensional, use 100 neurons, and have a radius of 10, so they can represent values between -10 and 10.
• The two function inputs should be constants set to 8 and 5. The terminations you create to connect them should have time constants of 0.01 (AMPA).

[Screen capture: the Nonlinear Network viewer, showing the inputA and inputB function inputs connected to two 100-neuron ensembles.]

• Now create a two-dimensional neural ensemble with a radius
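The same construction can also be written as a script; the following is an illustrative sketch (neuron counts and names are not taken from the demo) that computes the maximum of the two inputs by first combining them into a 2-D ensemble:

    import nef
    net = nef.Network('Max of two values')
    net.make_input('inputA', [8])
    net.make_input('inputB', [5])
    net.make('A', 100, 1, radius=10)
    net.make('B', 100, 1, radius=10)
    # Combine both values into one 2-D ensemble so a nonlinear function
    # of the pair can be decoded from it.
    net.make('C', 225, 2, radius=15)
    net.connect('inputA', 'A', pstc=0.01)
    net.connect('inputB', 'B', pstc=0.01)
    net.connect('A', 'C', transform=[1, 0])
    net.connect('B', 'C', transform=[0, 1])
    def maximum(x):
        return max(x[0], x[1])
    net.make('D', 100, 1, radius=10)
    net.connect('C', 'D', func=maximum)
    net.add_to_nengo()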
91. ilding NEF models is Nengo (http://nengo.ca). Nengo is a cross-platform Java application that provides both a drag-and-drop graphical user environment and a Python scripting interface for creating these neural models. It has been used to model a wide variety of behaviour, including motor control, visual attention, serial recall, action selection, working memory, attractor networks, inductive reasoning, path integration, and planning with problem solving.

However, given the complexity of Nengo, and due to the fact that this is a fairly non-standard approach to neural modelling, we feel it is also useful to have a simple example that shows exactly how the NEF works from beginning to end. That is the goal of this script.

This script shows how to build a simple feed-forward network of leaky integrate-and-fire neurons, where each population encodes a one-dimensional value and the connection weights between the populations are optimized to compute some arbitrary function. This same approach is used in Nengo, extended to multi-dimensional representation, multiple populations of neurons, and recurrent connections.

To change the input to the system, change 'input'. To change the function computed by the weights, change 'function'. The size of the populations and their neural properties can also be adjusted by changing the parameters below. This
92. ing functions also exist: argmax, argsort, argmin, asarray, bitwise_not, choose, clip, compress, concatenate, fromfunction, indices, nonzero, searchsorted, sort, take, where, tostring, fromstring, trace, repeat, diagonal, sum, cumsum, product, cumproduct, alltrue, sometrue.

The vast majority of the time you can use these objects the same way you would a normal list of values (i.e. for specifying transformation matrices). If you ever need to explicitly convert one back into a list, you can call tolist():

    a = array([1, 2, 3])
    b = a.tolist()

These functions are all available at the Nengo console and in any script called using the run command. To access them in a separate script file, you need to call:

    from numeric import *

3.8 List of Classes

3.8.1 nef.Network

class nef.Network(name, quick=None, seed=None, fixed_seed=None)
Wraps a Nengo network with a set of helper functions for simplifying the creation of Nengo models. This system is meant to allow short, concise code to create Nengo models. For example, we can make a communication channel like this:

    import nef
    net = nef.Network('Test Network')
    input = net.make_input('input', values=[0])
    A = net.make('A', neurons=100, dimensions=1)
    B = net.make('B', neurons=100, dimensions=1)
    net.connect(input, A)
    net.connect(A, B)
    net.add_to_nengo()

This will automatically create the necessary
93. inner-take-all. But you have to wait for the network to settle, so it can be rather slow.
• Gurney, Prescott & Redgrave (2001). In Nengo there is a drag-and-drop template for this BG model.
  - Model of action selection constrained by the connectivity of the basal ganglia

[Figure: connectivity of the basal ganglia model - cortex, striatum D1, striatum D2, STN, GPe, and GPi/SNr.]

  - Each component computes the following function:

[Figure: the output function used by each component, plotted for inputs from -0.5 to 1.0 with outputs from 0 to 1.]

  - Their model uses unrealistic rate neurons with that function for an output. We can use populations of spiking neurons and compute that function. We can also use correct timing values for the neurotransmitters involved.
• open demo/basalganglia.py
• Adjust the input controls to change the five utility values being selected between.
• The graph shows the output from the basal ganglia; each line shows a different action.
• The selected action is the one set to zero.

[Screen capture: the basal ganglia demo, with the five input sliders and the StrD1, STN, StrD2, GPe, and GPi output plots.]

• Comparison to neural data (Ryan & Clark, 1991):
  - Stimulate regions in medial orbitofrontal cortex, measure from GPi, and see how long it takes for a response to occur.

[Figure: stimulus and response traces from Ryan & Clark (1991).]

• To replicate: Set the
94. it needs to invert an NxN matrix. By default, an ensemble consists of neurons with randomly generated encoders, intercepts, and maximum firing rates within the specified ranges. This means that a new decoder must be computed for every new ensemble.

However, if we specify a random number seed, all of these parameters will be consistent. That is, if we run the script again and re-create the same ensemble with the same random number seed, Nengo will detect this and re-use the previously computed decoder. This greatly speeds up running the script. (Of course, the first time the script is run it will still take the same amount of time.)

There are two ways to specify the random number seed. The first is to set the fixed_seed parameter when creating the Network:

    net = nef.Network('My Network', fixed_seed=5)

This tells Nengo to use the random seed 5 for every single ensemble that is created within this network. In other words, if you now do the following:

    net.make('A', neurons=300, dimensions=1)
    net.make('B', neurons=300, dimensions=1)

then the neurons within ensembles A and B will be identical. This will also be true for ensembles within a NetworkArray:

    net.make_array('C', 300, 10)

which will create an array of ten identical ensembles, each with 300 identical neurons.

3.4.1 Avoiding Identical Ensembles

Using fixed_seed allows you to make network
95. itial state.

log(name=None, dir=None, filename='%(name)s-%(time)s.csv', interval=0.001, tau=0.01, auto_flush=True)
Creates a nef.Log object, which dumps data to a csv file as the model runs. See the nef.Log documentation for details.

Parameters:
• name (string) - The name of the model. Defaults to the name of the Network object.
• dir (string) - The directory to place the csv file into.
• filename (string) - The filename to use (.csv will be added if it is not already there). Can use %(name)s to refer to the name of the model and %(time)s to refer to the start time of the model. Defaults to %(name)s-%(time)s.csv.
• interval (float) - The time interval between each row of the log. Defaults to 0.001 (1 ms).
• tau (float) - The default filter time for data. Defaults to 0.01 (10 ms).

set_view_function_1d(node, basis, label='1D function', minx=-1, maxx=1, miny=-1, maxy=1)
Define a function representation for the given node. This has no effect on the model itself, but provides a useful display in the interactive plots visualizer. The vector represented by the node is plotted by treating the vector values as weights for a set of basis functions. So, if a vector is [2, 0, 3] and the basis functions are x^2, x, and 1, we get the polynomial 2x^2 + 3 (a small example basis function follows this entry).

The provided basis function should accept two parameters: index, indicating which basis function should be computed, and x, indicating the x value to compu
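A minimal example of a basis function compatible with the two-argument convention described above (the node name 'A' and the polynomial basis are arbitrary choices for illustration):

    # Basis functions x^2, x, 1, selected by index.
    def poly_basis(index, x):
        return [x ** 2, x, 1.0][index]

    net.set_view_function_1d('A', poly_basis, label='polynomial', minx=-1, maxx=1)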
96. k on the input and select control. This lets us vary the input while the simulation is running.
• Drag the slider up and down while the simulation is running (press Play again if it is paused). The neurons in ensemble A should be able to successfully represent the changing values.

[Screen capture: the input -> A -> B network with the input control slider.]

• We can also view what the individual neurons are doing during the simulation. Right-click on A and choose spike raster. This shows the individual spikes coming from the neurons. Since there are 100 neurons in ensemble A, the spikes from only a sub-set of these are shown. You can right-click on the spike raster graph and adjust the proportion of spikes shown; change it to 50%.
• Run the simulation and change the input. This will affect the neuron firing patterns.

[Screen capture: the spike raster for ensemble A as the input changes.]

• We can also see the voltage levels of all the individual neurons. Right-click on A and choose voltage grid. Each neuron is shown as a square, and the shading of that square indicates the voltage of that neuron's cell membrane, from black (resting potential) to white (firing threshold). Yellow indicates a spike.
• The neurons are initially randomly ordered. You can change this by right-clicking on the voltage grid and selecting improve layout. This will attempt to re-order the neurons so that
97. king on it and selecting hide. To show a hidden component, right-click on the background and select the component by name.
• The bottom of the window shows the controls for running the simulation. The simulation can be started and stopped by pressing the Play or Pause button at the bottom right. Doing this right now will run the simulation, but no data will be displayed, since we don't have any graphs open yet. The reset button on the far left clears all the data from the simulation and puts it back to the beginning.
• In the middle is a slider that shows the current time in the simulation. Once a simulation has been run, we can slide this back and forth to observe data from different times in the simulation.
• Right-clicking on a component also allows us to select a type of data to show about that component. Right-click on A and select value. This creates a graph that shows the value being represented by the neurons in ensemble A. You can move the graph by left-click-dragging it, and you can resize it by dragging near the corners or using a mouse scroll wheel.
• Press the Play button at the bottom right of the window and confirm that this group of neurons successfully represents its input value, which we previously set to be 0.5.

[Screen capture: the input -> A -> B network with a value graph for A.]

• Now let us see what happens if we change the input. Right-clic
98. lamus.png'

Next we define the parameters that should be set for the component. These can be strings (str), integers (int), real numbers (float), or checkboxes (bool). For each one we must indicate the name of the parameter, the label text, the type, and the help text:

    params = [
        ('name', 'Name', str, 'Name of thalamus'),
        ('neurons', 'Neurons per dimension', int, 'Number of neurons to use'),
        ('D', 'Dimensions', int, 'Number of actions the thalamus can represent'),
        ('useQuick', 'Quick mode', bool,
         'If true, the same distribution of neurons will be used for each action'),
        ]

Next we need a function that will test if the parameters are valid. This function will be given the parameters as a dictionary, and should return a string containing the error message if there is an error, or not return anything if there is no error:

    def test_params(net, p):
        try:
            net.network.getNode(p['name'])
            return 'That name is already taken'
        except:
            pass

Finally, we define the function that actually makes the component. This function will be passed a nef.Network object corresponding to the network we have dragged the template into, along with all of the parameters specified in the params list above. This script can now do any scripting calculations desired to build the model:

    def make(net, name='Network Array', neurons=50, D=2, useQuick=True):
        thal = net.make_array(name, neurons, D, max_rate=(100, 300),
99. latest release that is compatible with version 4.0 of the CUDA toolkit. CULA is NOT required if you only want to run simulations on the GPU.

Note: Most of the following steps have to be performed for both NengoGPU (the library for running Nengo simulations on the GPU) and NengoUtilsGPU (the library for performing the SVD on the GPU). Here I will detail the process of installing NengoGPU, but the process of installing NengoUtilsGPU is almost identical. I will add comments about how to adjust the process for installing NengoUtilsGPU, and will mark these with ***. The main difference is that NengoUtilsGPU relies on CULA, whereas NengoGPU has no such dependency, and this manifests itself in several places throughout the process.

4.6.1 Linux

Step 1: Install CUDA Developer Driver

1. Be sure you have downloaded the CUDA developer driver installer for your system from the link provided above. Note where the file gets downloaded.
2. In a shell, enter the command:

    sudo service gdm stop

[Console output: memory-usage messages and 'WeightedCostApproximator: Using 53 singular values' from an ensemble build.]

This stops the X server. The X server relies on the
100. le using integrator.py in the demo directory.

1.4.5 Controlled Integrator

• We can also build an integrator where the feedback transformation (1 in the previous model) can be controlled. This allows us to build a tunable filter.
• This requires the use of multiplication, since we need to multiply two stored values together. This was covered in the previous part of the tutorial.
• We can efficiently implement this by using a two-dimensional ensemble. One dimension will hold the value being represented, and the other dimension will hold the transformation weight.
• Create a two-dimensional neural ensemble with 225 neurons and a radius of 1.5.
• Create the following three terminations:
  - input: time constant of 0.1, 1-dimensional, with a transformation matrix of [0.1, 0]. This acts the same as the input in the previous model, but only affects the first dimension.
  - control: time constant of 0.1, 1-dimensional, with a transformation matrix of [0, 1]. This stores the input control signal into the second dimension of the ensemble.
  - feedback: time constant of 0.1, 1-dimensional, with a transformation matrix of [1, 0]. This will be used in the same manner as the feedback termination in the previous model.
• Create a new origin that multiplies the values in the vector together. This is exactly the same as the multiplier in the previous part of this tutorial. This is a 1-dimensional output with a User-defined Function of x0*x1. A scripting sketch of this construction is shown below.
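A minimal scripting sketch of the structure just described (the tutorial builds it through the GUI; the names and the use of a single recurrent connection here are illustrative assumptions):

    import nef
    net = nef.Network('Controlled Integrator sketch')
    net.make_input('input', [0])
    net.make_input('control', [1])
    net.make('A', 225, 2, radius=1.5)
    net.connect('input', 'A', transform=[0.1, 0], pstc=0.1)    # drives dimension 0 only
    net.connect('control', 'A', transform=[0, 1], pstc=0.1)    # stored in dimension 1
    def product(x):
        return x[0] * x[1]
    # feedback: the product of the two dimensions is fed back into dimension 0
    net.connect('A', 'A', func=product, transform=[1, 0], pstc=0.1)
    net.add_to_nengo()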
101. line gives the size and location of the window, the last line gives the settings of the various simulation parameters, and the middle lines define the various plots that are displayed.

While this saved file format is human-readable, it is not meant to be hand-coded. The best way to create a layout is to open up the Interactive Plots window, create the layout you want, and save it. A corresponding file will be created in the layouts folder.

4.1.1 Specifying a Layout

If you want to, you can define a layout in a script by cutting and pasting from a layout file. For this you can use the nef.Network.set_layout function. For example, here we define a simple network and directly specify the layout to use:

    import nef
    net = nef.Network('My Test Model')
    net.make_input('input', [0, 0])
    net.make('neurons', 100, 2, quick=True)
    net.connect('input', 'neurons')
    net.add_to_nengo()
    net.set_layout({'height': 473, 'x': 983, 'width': 798, 'state': 0},
        [('neurons', None, {'x': 373, 'height': 32, 'label': 0, 'width': 79, 'y': 76}),
         ('input', None, {'x': 53, 'height': 32, 'label': 0, 'width': 51}),
         ('neurons', 'voltage grid', {'x': 489, 'height': 104, 'auto_improve': 0,
                                      'label': 0, 'width': 104, 'rows': None, 'y': 30}),
         ('neurons', 'value|X', {'x': 601, 'height': 105, 'sel_dim': [0, 1],
                                 'label': 0, 'width': 158, 'aut
102. llable input with a starting value of 45

    net.make('neurons', neurons=100,    # Make a population of 100 neurons
             dimensions=1, noise=1)      # representing 1 dimension, with random injected input noise of variance 1
    net.connect('input', 'neurons')      # Connect the input to the neurons
    net.add_to_nengo()

2.1.4 2D Representation

Purpose: This demo shows how to construct and manipulate a population of 2D neurons.

Comments: These are 100 leaky integrate-and-fire (LIF) neurons. The neuron tuning properties have been randomly selected to encode a 2D space (i.e. each neuron has an encoder randomly selected from the unit circle).

Usage: Grab the slider controls and move them up and down to see the effects of shifting the input throughout the 2D space. As a population, these neurons do a good job of representing a 2D vector value. This can be seen by the fact that the input graph and the neurons graphs match well.

Output: See the screen capture below. The circle plot shows the preferred direction vector of each neuron multiplied by its firing rate; this kind of plot was made famous by Georgopoulos et al.

[Screen capture: the 2D Representation demo, with the input sliders, the 2-D value graphs, and the preferred-direction circle plot.]

Code:

    import nef
    net = nef.Network('2D Representation')   # Create the network
    net.make_input('input',
103. mbrane time constant
• tau_ref (float) - refractory period
• max_rate (tuple or list) - range for uniform selection of maximum firing rate in Hz, as a 2-tuple, or a list of maximum rate values to use
• intercept (tuple or list) - normalized range for uniform selection of tuning-curve x-intercept, as a 2-tuple, or a list of intercept values to use. Intercepts are defined with respect to encoders, so an encoder of -1 and an intercept of 0.3 will result in a neuron only active below -0.3.
• radius (float) - representational range
• encoders (list) - list of encoder vectors to use (if None, uniform distribution around the unit sphere). The provided encoders will be automatically normalized to unit length.
• decoder_noise (float) - amount of noise to assume when calculating decoders
• eval_points (list) - list of points to do optimization over
• noise (float) - current noise to inject, chosen uniformly from [-noise, noise]
• noise_frequency (float) - sampling rate (how quickly the noise changes)
• mode (string) - simulation mode ('direct', 'rate', or 'spike')
• node_factory (ca.nengo.model.impl.NodeFactory) - a factory to use instead of the default LIF factory (for creating ensembles with neurons other than LIF)
• decoder_sign (None, -1, or 1) - 1 for positive decoders, -1 for negative decoders. Set to None to allow both.
• quick (boolean or None) -

(A short example using several of these parameters follows below.)
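A brief illustration (all values chosen arbitrarily) of passing several of these options to nef.Network.make:

    net.make('tuned', neurons=100, dimensions=1,
             max_rate=(100, 200),     # maximum firing rates drawn from 100-200 Hz
             intercept=(-0.5, 0.5),   # tuning-curve x-intercepts drawn from [-0.5, 0.5]
             radius=2.0,              # represent values in roughly [-2, 2]
             noise=0.1,               # inject current noise chosen from [-0.1, 0.1]
             mode='spike')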
104. n capture below.

[Screen capture: the Sequence demo, showing the utility, rules, buffer, and thalamus graphs cycling through the states A to E.]

Code:

    from spa import *

    D = 16

    class Rules:                 # Define the rules by specifying the start state and the desired next state
        def A(state='A'):        # e.g. If in state A,
            set(state='B')       # then go to state B
        def B(state='B'):
            set(state='C')
        def C(state='C'):
            set(state='D')
        def D(state='D'):
            set(state='E')
        def E(state='E'):
            set(state='A')

    class Sequence(SPA):         # Define an SPA model
        dimensions = 16

        state = Buffer()         # Create a working memory (recurrent network) object: i.e. a Buffer
        BG = BasalGanglia(Rules())   # Create a basal ganglia with the prespecified set of rules
        thal = Thalamus(BG)      # Create a thalamus for that basal ganglia (so it uses the same rules)

        input = Input(0.1, state='D')   # Define an input; set the input to state D for 100 ms

    seq = Sequence()

2.4.3 Routed Sequencing

Purpose: This demo uses the basal ganglia model to cycle through a 5-element sequence, where an arbitrary start can be presented to the model.

Comments: This basal ganglia is now hooked up to a memory and includes routing. The addition of routing allows th
105. n the 2D output.

[Screen capture: the Oscillator demo, showing the input, the 2-D value of A, and the spike raster for A.]

Code:

    import nef

    net = nef.Network('Oscillator')   # Create the network object

    net.make_input('input', [1, 0], zero_after_time=0.1)
        # Create a controllable input function with a starting value of [1, 0],
        # then make it go to zero after 0.1 s

    net.make('A', 200, 2)   # Make a population with 200 neurons, 2 dimensions

    net.connect('input', 'A')
    net.connect('A', 'A', [[1, 1], [-1, 1]], pstc=0.1)
        # Recurrently connect the population with the connection matrix for a
        # simple harmonic oscillator, mapped to neurons with the NEF

    net.add_to_nengo()

2.3.5 Controlled Oscillator

Purpose: This demo implements a controlled two-dimensional oscillator.

Comments: This is functionally analogous to the controlled integrator, but in a 2-D space. The available slider allows the frequency to be directly controlled, to be negative or positive. This behaviour maps directly to the differential equation used to describe a simple harmonic oscillator, i.e. dx/dt = Ax(t) + Bu(t), where A = [[0, freq], [-freq, 0]]
106. ncrease after a certain point. This is because all neural ensembles have a range of values they can represent (the radius) and cannot accurately represent values outside of that range.

[Screen capture: the integrator saturating near 1.0 when the input drives it beyond its radius.]

• Adjust the radius of the ensemble to 1.5, using either the Configure interface or the script console:

    that.radii = [1.5]

Run the model again. It should now accurately integrate up to a maximum of 1.5.

[Screen capture: the integrator now reaching values up to about 1.5.]

1.4.3 Complex Input

• We can also run the model with a more complex input. Change the Function input using the following command from the script console, after clicking on it in the black model-editing interface (press Ctrl-1, or Command-1 on OS X, to show the script console):

    that.functions = [ca.nengo.math.impl.PiecewiseConstantFunction(
        [0.2, 0.3, 0.44, 0.54, 0.8, 0.9],
        [0, 5, 0, -10, 0, 5, 0])]

• You can see what this function looks like by right-clicking on it in the editing interface and selecting plot function (Function index 0, Start 0, Increment 0.001, End 1). From a script, the same kind of input can be specified directly, as shown below.
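When building the model from a script instead, the same piecewise input can be given to make_input as a dictionary of time-value pairs, as the demo scripts elsewhere in this documentation do; for example:

    # value 5 at t=0.2s, 0 at 0.3s, -10 at 0.44s, and so on
    net.make_input('input', {0.2: 5, 0.3: 0, 0.44: -10, 0.54: 0, 0.8: 5, 0.9: 0})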
107. ndows, you can do:

    nengo-cl experiment.py

This will run whatever script is in experiment.py.

4.4 Running Nengo in Matlab

Since Nengo is a Java application, it can be directly embedded within Matlab. This allows stimuli and data analysis to be performed within Matlab, which may be more familiar to some users.

4.4.1 Step 1: Set up your Matlab classpath

Matlab needs to know where to find all the java files needed for running Nengo. You do this by adding the names of all the jar files to Matlab's classpath.txt file. You can edit this file from within Matlab by typing:

    edit classpath.txt

You need to add a bunch of lines to the end of this file that list all the jar files found in your installation of Nengo, with their full paths. For example:

    Nengo/nengo-1074/nengo-1074.jar
    Nengo/nengo-1074/lib/Blas.jar
    Nengo/nengo-1074/lib/colt.jar
    Nengo/nengo-1074/lib/commons-collections-3.2.jar
    Nengo/nengo-1074/lib/formsrt.jar
    Nengo/nengo-1074/lib/iText-5.0.5.jar
    Nengo/nengo-1074/lib/Jama-1.0.2.jar
    Nengo/nengo-1074/lib/jcommon-1.0.0.jar
    Nengo/nengo-1074/lib/jfreechart-1.0.1.jar
    Nengo/nengo-1074/lib/jmatio.jar
    Nengo/nengo-1074/lib/jung-1.7.6.jar
    Nengo/nengo-1074/lib/jython.jar
    Nengo/nengo-1074/lib/log4j-1.2.14.jar
    Nengo/nengo-1074/lib/piccolo.jar
    Nengo/nengo-1074/lib/piccolox.jar
    Nengo/nengo-1074/lib/qdox-1.6.3.jar
    Nengo/nengo-1074/lib/ssj.jar
108. ngo instead.

view(play=False)
Creates the interactive-mode viewer for running the network.

Parameters:
• play (False or float) - Automatically starts the simulation running, stopping after the given amount of time.

set_layout(view, layout, control)
Defines the graphical layout for the interactive plots. You can use this to specify a particular layout. This will replace the currently saved layout (if any). Useful when running a script on a new computer that does not have a previously saved layout, saving you from also copying over that layout file.

The arguments for this function call are generally made by opening up interactive plots, making the layout you want, saving the layout, and then copying the text in layouts/<networkname>.layout.

Parameters:
• view (dictionary) - parameters for the window position
• layout (list) - list of all components to be shown and their parameters
• control (dictionary) - configuration parameters for the simulation

add(node)
Add the node to the network. This is generally only used for manually created nodes, not ones created by calling nef.Network.make or nef.Network.make_input, as these are automatically added. A common usage is with nef.SimpleNode objects, as in the following:

    node = net.add(MyNode('name'))

Parameters:
• node - the node to be added
Returns: node

remove(node)
Remove nodes from
109. nsion is 1, the index of the second ensemble's first dimension is 2, and so on.

Parameters:
• name (string) - name of the ensemble array (must be unique)
• neurons (integer) - number of neurons in each ensemble
• length (integer) - number of ensembles in the array
• dimensions (integer) - number of dimensions each ensemble represents
Returns: the newly created ca.nengo.model.impl.NetworkArrayImpl

make_input(name, values, zero_after_time=None)
Create and return a FunctionInput of dimensionality len(values), with values as its constants. Python functions can be provided instead of fixed values.

Parameters:
• name (string) - name of the created node
• values (list, function, string, or dict) - numerical values for the function. If a list, it can contain a mixture of floats and functions (floats are fixed input values; functions are called with the current time and must return a single float). If values is a function, it will be called with the current time and can return either a single float or a list of floats. If a string, it will be treated as the filename of a csv file, with the first number in each row indicating the time and the other numbers giving the value of the function. If a dictionary, the key-value pairs in the dictionary will be treated as time-value pairs.
• zero_after_time (float or None) - if not None, any fixed input value will change to 0 after this amount of time.
Returns: the created FunctionInput (short examples of these input formats are shown below)

make_fourier_inpu
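A few illustrative calls covering the input formats listed above (the names and numbers are arbitrary):

    import math

    net.make_input('constants', [0.5, -0.3])                        # fixed 2-D input
    net.make_input('sine', lambda t: math.sin(2 * math.pi * t))     # function of time
    net.make_input('steps', {0.2: 5, 0.5: 0, 0.8: -5})              # time-value pairs
    net.make_input('pulse', [1], zero_after_time=0.1)               # drops to 0 after 0.1 s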
110. nspector, or select the ensemble and click the magnifying glass in the top right.

[Screen capture: the inspector for an NEFEnsembleImpl, listing properties such as approximatorFactory, collectSpikesRatio, dimension, directModeDynamics, directModeIntegrator, documentation, encoders, ensembleFactory, mode, name, neurons, nodes, origin, plasticityInterval, plasticityRule, and radii.]

• neurons: number of neurons (changing this will rebuild the whole ensemble)
• radii: the range of values that can be encoded (can be different for different dimensions)
• encoders: preferred direction vectors for each neuron

1.1.7 The Script Console

• Nengo also allows users to interact with the model via a scripting interface using the Python language. This can be useful for writing scripts to create components of models that you use often.
• You can also use it to inspect and modify various aspects of the model.
• Press Ctrl-1 (Command-1 on OS X) or choose View -> Toggle Script Console to show the script interface. The full flexibility of the Python programming language is available in this console. It interfaces to the underlying Java code of the simulation using Jython, making all Java methods available.
111.
            return self.RED_CIRCLE
        elif 0.35 < self.t < 0.5:
            return self.BLUE_SQUARE
        elif self.t > 0.5 and 0.2 < (self.t - 0.5) % 0.6 < 0.4:
            return self.RED
        elif self.t > 0.5 and 0.4 < (self.t - 0.5) % 0.6 < 0.6:
            return self.SQUARE
        else:
            return self.zero

    # Add the input to the network
    inv = Input('inv')
    net.add(inv)

    # This is an older way of implementing an SPA (see the SPA routing examples),
    # using the nps code directly
    prods = nps.ProductionSet()
    prods.add(dict(visual='STATEMENT'), dict(visual_to_wm=True))
    prods.add(dict(visual='QUESTION'), dict(wm_deconv_visual_to_motor=True), subdim=4)

    model = nps.NPS(net, prods, D,
                    direct_convolution=False,
                    direct_buffer=['visual'],
                    neurons_buffer=N / subdim,
                    subdimensions=subdim)
    model.add_buffer_feedback(wm=1, pstc=0.4)

    net.connect(inv.getOrigin('x'), 'buffer_visual')

    # Rename objects for display purposes
    net.network.getNode('prod').name = 'thalamus'
    net.network.getNode('buffer_visual').name = 'visual'
    net.network.getNode('buffer_wm').name = 'memory'
    net.network.getNode('buffer_motor').name = 'motor'
    net.network.getNode('channel_visual_to_wm').name = 'channel'
    net.network.getNode('wm_deconv_visual_to_motor').name = '*'
    net.network.getNode('gate_visual_wm').name = 'gate1'
    net.network.getNode('gate_wm_visual_motor').name =
112. o/nengo-1074/python/jar/jbullet.jar
    Nengo/nengo-1074/python/jar/jpct.jar
    Nengo/nengo-1074/python/jar/vecmath.jar

This is technically all that needs to be done to run Nengo inside Matlab. Once you restart Matlab, you should be able to do:

    import ca.nengo.model.impl.*
    network = NetworkImpl()
    network.setName('test')

All the basic functionality of Nengo is exposed in this way.

You may also want to increase the memory available to Java. To do this, create a file called java.opts in the Matlab startup directory with the following command in it:

    -Xmx1500m

This will give Java a maximum of 1500MB of memory.

4.4.2 Step 2: Connecting Python and Matlab

Since we're used to creating models using the Python scripting system, we want to do the same in Matlab. To set this up, we do the following in Matlab:
113. o_nengo()

2.5.3 Learning to Compute the Square of a Vector

Purpose: This demo shows learning a nonlinear function of a vector.

Comments: The set-up here is very similar to the Learning a Communication Channel demo. The main difference is that this demo works in a 2D vector space instead of a scalar, and that it is learning to compute a nonlinear function (the element-wise square of its input).

Usage: When you run the network, it automatically has a random white-noise input injected into it in both dimensions.
• Turn learning on: To allow the learning rule to work, you need to move the switch to 1.
• Monitor the error: When the simulation starts and learning is on, the error is high. The average error slowly begins to decrease as the simulation continues. After 15 s or so of simulation, it will do a reasonable job of computing the square, and the error in both dimensions should be quite small.
• Is it working? To see if the right function is being computed, compare the pre and post population value graphs. You should note that post looks kind of like an absolute value of pre (the post will be a bit squashed). You can also check that both graphs of either dimension should hit zero at about the same time.

Output: See the screen capture below.

[Screen capture: the Learn Square demo, with the switch, error, pre, and post graphs.]
114. od
            else:
                spikes.append(False)
        return spikes

    # measure the spike rate of a whole population for a given represented value x
    def compute_response(x, encoder, gain, bias, time_limit=0.5):
        N = len(encoder)   # number of neurons
        v = [0] * N        # voltage
        ref = [0] * N      # refractory period

        # compute input corresponding to x
        input = []
        for i in range(N):
            input.append(x * encoder[i] * gain[i] + bias[i])
            v[i] = random.uniform(0, 1)   # randomize the initial voltage level

        count = [0] * N    # spike count for each neuron

        # feed the input into the population for a given amount of time
        t = 0
        while t < time_limit:
            spikes = run_neurons(input, v, ref)
            for i, s in enumerate(spikes):
                if s: count[i] += 1
            t += dt
        return [c / time_limit for c in count]   # return the spike rate (in Hz)

    # compute the tuning curves for a population
    def compute_tuning_curves(encoder, gain, bias):
        # generate a set of x values to sample at
        x_values = [i * 2.0 / N_samples - 1.0 for i in range(N_samples)]

        # build up a matrix of neural responses to each input (i.e. tuning curves)
        A = []
        for x in x_values:
            response = compute_response(x, encoder, gain, bias)
            A.append(response)
        return x_values, A

    # compute decoders
    import numpy
    def compute_decoder(encoder, gain, bias, function=lambda x: x):
        # get the tuning curves
        x_values, A = compute_tuning_curves(encoder, gain, bias)

        # get the desired decoded value for each sample point
115. often be much faster to test the algorithm without neurons, in direct mode, before switching to a realistic neural model.

Note: When using direct mode, you may want to decrease the number of neurons in the population to 1, as this makes it much faster to create the ensemble.

3.7.5 Arrays of ensembles

When building models that represent large numbers of dimensions, it is sometimes useful to break an ensemble down into sub-ensembles, each of which represents a subset of dimensions. Instead of building one large ensemble to represent 100 dimensions, we might have 10 ensembles that represent 10 dimensions each, or 100 ensembles representing 1 dimension each.

The main advantage of this is speed: it is much faster for the NEF methods to compute decoders for many small ensembles rather than one big one. However, there is one large disadvantage: you cannot compute nonlinear functions that use values in two different ensembles. One of the core claims of the NEF is that we can only approximate nonlinear functions of two or more variables if there are neurons that respond to both dimensions. However, it is still possible to compute any linear function.

We create an array by specifying its length and, optionally, the number of dimensions per ensemble (the default is 1); a sketch of connecting arrays linearly follows below:

    net.make_array('M', neurons=100, length=10, dimensions=1)

You can also use all of the parameters available in nef.Network.make to configure the properties of the neu
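For example, a purely linear operation such as a communication channel works the same way with arrays as with single ensembles; a minimal sketch (names illustrative):

    net.make_input('input10', [0] * 10)
    net.make_array('M', neurons=100, length=10, dimensions=1)
    net.make_array('M2', neurons=100, length=10, dimensions=1)
    net.connect('input10', 'M')
    net.connect('M', 'M2')    # linear functions across an array are fine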
116.
• If a statement is in the visual area, move it to working memory (as in the previous example).
• If a question is in the visual area, unbind it with working memory and place the result in the motor area.
• This example requires a much larger simulation than any of the others in this tutorial (more than 50,000 neurons). If you run this script, Nengo may take a long time (hours) to solve for the decoders and neural connection weights needed. We have pre-computed the larger of these networks for you, and they can be downloaded at http://ctn.uwaterloo.ca/cnrglab/f/question.zip
• open demo/question.py

[Screen capture: the question-answering network, showing the buffer_visual, buffer_wm, and buffer_motor semantic pointer graphs, along with channel_visual_to_wm and wm_deconv_visual_to_motor.]

CHAPTER TWO: NENGO DEMOS

This section describes the collection of demos that comes with Nengo. To use any of these demo scripts in Nengo, do the following:
• Open: Open any <demo>.py file by clicking on the icon, or by going to File -> Open from file in the menu and selecting the file from the demos directory in your Nengo installation.
• Run: Run the demo by selecting the network created in
117. ogram, or you can compile each individual code sample on its own (there are a lot of them, so you probably won't want to compile ALL of them this way, just the ones that seem interesting). Just cd into any of the directories under C/src and type make there. If compilation succeeds, a binary executable file will be created in C/bin/linux/release.

To run any sample program, cd into C/bin/linux/release and type <name of program>. If the program in question was compiled properly, you should see a bunch of output about what computations are being performed, as well as either a PASS or a FAIL (FAILs are bad). Some useful samples are:

deviceQueryDrv: Simple test to make sure CUDA programs have access to the GPU. Also displays useful information about the CUDA-enabled GPUs on your system, if it can find and access them.

bandwidthTest: Tests bandwidth between CPU and GPU. This bandwidth can sometimes be a bottleneck of the NengoGPU implementation. Online you can usually find bandwidth benchmarks which say roughly what the bandwidth should be for a given card. If your bandwidth is much lower than the benchmark for your card, there may be a problem with your setup.

simpleMultiGPU: Useful if your system has multiple GPUs. Tests whether they can all be used together.

*** Step 4: Install CULA Dense (only required if installing NengoUtilsGPU)

This step is very similar to Step 2 (Install CUDA Toolkit).

1. Be sure you have downloaded
118. ollows:

    c = a + b      # same as in matlab
    c = a * b      # same as in matlab
    b = cos(a)     # computes cosine of all values in a

Other known functions: add, subtract, multiply, divide, remainder, power, arccos, arccosh, arcsinh, arctan, arctanh, ceil, conjugate, imaginary, cos, cosh, exp, floor, log, log10, real, sin, sinh, sqrt, tan, tanh, maximum, minimum, equal, not_equal, less, less_equal, greater, greater_equal, logical_and, logical_or, logical_xor, logical_not, bitwise_and, bitwise_or, bitwise_xor.

You can also create particular arrays:

    arange(5)                     # same as array(range(5)): [0, 1, 2, 3, 4]
    arange(2, 5)                  # same as array(range(2, 5)): [2, 3, 4]
    eye(5)                        # 5x5 identity matrix
    ones((3, 2))                  # 3x2 matrix of all 1
    ones((3, 2), typecode='f')    # 3x2 matrix of all 1.0 (floating point values)
    zeros((3, 2))                 # 3x2 matrix of all 0

The following functions help manipulate the shape of a matrix:

    a.shape                  # get the current size of the matrix
    b = reshape(a, (3, 4))   # convert to a 3x4 matrix (must already have 12 elements)
    b = resize(a, (3, 4))    # convert to a 3x4 matrix (can start at any size)
    b = ravel(a)             # convert to a 1-D vector
    b = diag([1, 2, 3])      # create a diagonal matrix with the given values

Some basic linear algebra operations are available:

    c = dot(a, b)
    c = dot(a, a.T)
    c = innerproduct(a, a)
    c = convolve(a, b)

And a Fourier transform:

    b = fft(a)
    a = ifft(b)

The follow
119. ommunication channel (0) to an integrator (1), with various low-pass filtering occurring in between.

Usage: When you run this demo, it will automatically put in some step functions on the input, so you can see that the output is integrating (i.e. summing over time) the input. You can also input your own values. It is quite sensitive, like the integrator. But if you reduce the control input below 1, it will not continuously add its input, but slowly allow that input to leak away. It's interesting to re-run the simulation from the start, decreasing the control input before the automatic input starts, to see the effects of leaky integration.

Output: See the screen capture below.

[Screen capture: the Controlled Integrator demo, with the control and input graphs and the value of A.]

Code:

    import nef

    net = nef.Network('Controlled Integrator')   # Create the network object

    net.make_input('input', {0.2: 5, 0.3: 0, 0.44: -10, 0.54: 0, 0.8: 5, 0.9: 0})
        # Create a controllable input function with a default function that
        # goes to 5 at time 0.2s, to 0 at time 0.3s, and so on

    net.make_input('control', [1])
        # Create a controllable input function with a starting value of 1

    net.make('A', 225, 2, radius=1.5)
        # Make a population with 225 neurons, 2 dimensions, and a larger
        # radius to accommodate large simultaneous inputs

    net.connect('input', 'A', transform=[0.1, 0], p
120. on the icon to toggle the script console, or press Ctrl-P to jump to it. You can now type script commands that will be immediately run. For example, you can type this at the console:

    print 'Hello from Nengo'

When using the script console, you can refer to the currently selected object using the word that. For example, if you click on a component of your model (it should be highlighted in yellow to indicate it is selected), you can get its name by typing:

    print that.name

You can also change aspects of the model this way. For example, click on an ensemble of neurons and change the number of neurons in that ensemble by typing:

    that.neurons = 75

You can also run a script file from the console with this command (with script_name replaced by the name of the script to run):

    run script_name.py

Pressing the up and down arrows will scroll through the history of your console commands.

3.1.3 Running scripts from the command line

You can also run scripts from the command line. This allows you to simply run the model without running Nengo itself. To do this, instead of running Nengo with nengo (or nengo.bat on Windows), do:

    nengo-cl script_name.py

Of course, since this bypasses the Nengo graphical interface, you won't be able to click on the icon to show the model. Instead, you should add this to the end of your model:

    net.view()

3.1.4 Importing other libraries

The scripting sy
121. onnect('A', 'B', func=powers)

    def product(x):
        return x[0] * x[1]

    net.connect('A', 'B', func=product)

• origin_name (string) - The name of the origin to create to compute the given function. Ignored if func is None. If an origin with this name already exists, the existing origin is used instead of creating a new one.
• weight_func (function or None) - if not None, converts the connection to use an explicit connection weight matrix between each neuron in the ensembles. This is mathematically identical to the default method (which simply uses the stored encoders and decoders for the ensembles), but much slower, since we are no longer taking advantage of the factorable weight matrix. However, using weight_func also allows explicit control over the individual connection weights, as the computed weight matrix is passed to weight_func, which can make changes to the matrix before returning it. If weight_func is a function taking one argument, it is passed the calculated weight matrix. If it is a function taking two arguments, it is passed the encoder and decoder.
• expose_weights - if True, set weight_func to the identity function. This makes the connection use explicit connection weights, but doesn't modify them in any way. Ignored if weight_func is not None.
• modulatory (boolean) - whether the created connection should be marked as modulatory, meaning that it does not directly affect the input current to the neurons, but instead may affect intern
122. operator. This shows that convolution can be used to transform representations via binding and unbinding, since binding a*b with the (approximate) inverse of a recovers approximately b.

1.5.3 Control and Action Selection: Basal Ganglia

Note: Much of what follows is summarized in Chp. 5 of How to Build a Brain and can be constructed using the drag-and-drop templates in Nengo (specifically Basal Ganglia, BG Rule, Thalamus, Binding, Gate, and Integrator). Also see the relevant demos section of the documentation: http://nengo.ca/docs/html/demos/demos.html#basal-ganglia-based-simulations

• Pretty much every cognitive model has an action selection component, with these features:
  - Out of many possible things you could do right now, pick one.
  - Usually mapped onto the basal ganglia.
  - Some sort of winner-take-all calculation, based on how suitable the various possible actions are to the current situation.
• Input: A vector representing how good each action is, for example [0.2, 0.3, 0.9, 0.1, 0.7]
• Output: Which action to take: [0, 0, 1, 0, 0]
  - Actually, the output from the basal ganglia is inhibitory, so the output is more like [1, 1, 0, 1, 1] (a concrete, non-neural illustration of this mapping follows below)
• Implementation:
  - Could try doing it as a direct function:
    * Highly non-linear function
    * Low accuracy
  - Could do it by setting up inhibitory interconnections:
    * Like the integrator, but any value above zero would also act to decrease the others
    * Often used in non-spiking neural networks (e.g. PDP) to do k-w
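To make the input/output convention concrete, here is a purely illustrative, non-neural sketch of the mapping the basal ganglia implements (it says nothing about how the spiking model computes it):

    utilities = [0.2, 0.3, 0.9, 0.1, 0.7]
    best = utilities.index(max(utilities))
    # Inhibitory coding: the selected action is the one released from inhibition.
    bg_output = [0 if i == best else 1 for i in range(len(utilities))]
    print bg_output    # [1, 1, 0, 1, 1]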
123. ork.compute_transform(dim_pre, dim_post, weight=1, index_pre=None, index_post=None)
Helper function used by nef.Network.connect to create a dim_pre-by-dim_post matrix. All values are either 0 or weight. index_pre and index_post are used to determine which values are non-zero, and indicate which dimensions of the pre-synaptic ensemble should be routed to which dimensions of the post-synaptic ensemble. For example, with dim_pre=2 and dim_post=3, index_pre=[0, 1] and index_post=[0, 1] means to take the first two dimensions of pre and send them to the first two dimensions of post, giving a transform matrix of [[1, 0], [0, 1], [0, 0]]. If an index is None, the full range [0, 1, 2, ..., N] is assumed, so the above example could just be index_post=[0, 1].

Parameters:
• dim_pre (integer) - first dimension of transform matrix
• dim_post (integer) - second dimension of transform matrix
• weight (float) - the non-zero value to put into the matrix
• index_pre (list of integers or a single integer) - the indexes of the pre-synaptic dimensions to use
• index_post (list of integers or a single integer) - the indexes of the post-synaptic dimensions to use
Returns: a two-dimensional transform matrix performing the requested routing

connect(pre, post, transform=None, weight=1, index_pre=None, index_post=None, pstc=0.01, func=None, weight_func=None, expose_weights=False, origin_name=None, modulatory=False, plastic_array=False, create_projection=T
124. ork object

    net.make_input('input A', [0])   # Create a controllable input function with a starting value of 0
    net.make_input('input B', [0])   # Create another controllable input function with a starting value of 0
    net.make('A', 100, 1)            # Make a population with 100 neurons, 1 dimension
    net.make('B', 100, 1)            # Make a population with 100 neurons, 1 dimension
    net.make('C', 100, 1)            # Make a population with 100 neurons, 1 dimension
    net.connect('input A', 'A')      # Connect all the relevant objects
    net.connect('input B', 'B')
    net.connect('A', 'C')
    net.connect('B', 'C')
    net.add_to_nengo()

2.2.4 Combining 1D Representations into a 2D Representation

Purpose: This demo shows how to construct a network that combines two 1D inputs into a 2D representation.

Comments: This can be thought of as two communication channels projecting to a third population, but instead of combining the input (as in addition), the receiving population represents the two inputs as being independent.

Usage: Grab the slider controls and move them up and down to see the effects of increasing or decreasing input. Notice that the output population represents both dimensions of the input independently, as can be seen by the fact that each input slider only changes one dimension in the output.

Output: See the screen capture below.

[Screen capture: the Combining demo, with the two input sliders and the 2-D output graph.]
125. orm useful computations (e.g. soft normalization).

Output: See the screen capture below.

[Screen capture: the Integrator demo, with the input graph and the value of A.]

Code:

    import nef

    net = nef.Network('Integrator')   # Create the network object

    net.make_input('input', {0.2: 5, 0.3: 0, 0.44: -10})
        # Create a controllable input function with a default function that
        # goes to 5 at time 0.2s, to 0 at time 0.3s, and so on

    net.make('A', 100, 1, quick=True)   # Make a population with 100 neurons, 1 dimension

    net.connect('input', 'A', weight=0.1, pstc=0.1)
        # Connect the input to the integrator, scaling the input by 0.1
        # (the postsynaptic time constant)

    net.connect('A', 'A', pstc=0.1)
        # Connect the population to itself with the default weight of 1

    net.add_to_nengo()

2.3.2 Controlled Integrator

Purpose: This demo implements a controlled one-dimensional neural integrator.

Comments: This is the first example of a controlled dynamic network in the demos. It is the same as the integrator circuit, but we have now introduced a population that explicitly affects how well the circuit remembers its past state. The control input can be used to make the integrator change from a pure c
126. ozoom': 0, 'last_maxy': 1.0, 'y': 29}),
         ('neurons', 'preferred directions', {'x': 558, 'height': 200, 'label': 0, 'width': 200}),
         ('neurons', 'XY plot', {'width': 200, 'autohide': 1, 'last_maxy': 1.0, 'sel_dim': [0, 1],
                                 'y': 148, 'label': 0, 'x': 346, 'height': 200, 'autozoom': 1}),
         ('input', 'control', {'x': 67, 'height': 200, 'range': 1.0, 'label': 1, 'limits': 1,
                               'width': 120, 'y': 145})],
        {'sim_spd': 4, 'rec_time': 4.0, 'filter': 0.03, 'dt': 0, 'show_time': 0.5})

This ability is useful when sending a model to someone else, so that they will automatically see the particular set of graphs you specify. This can be easier than also sending the layout file.

4.2 Creating a Drag-And-Drop Template

Nengo comes with a variety of templates: pre-built components that can be used to build your models. These are the various icons on the left side of the screen that can be dragged into your model. These components are defined in python/nef/templates. There is one file for each item, and the following example uses thalamus.py.

The file starts with basic information, including the full name (title) of the component, the text to be used in the interface (label), and an image to use as an icon. The image should be stored in images/nengoIcons:

    title = 'Thalamus'
    label = 'Thalamus'
    icon = 'tha
127. pen demo/sequencenogate.py

[Screen capture: the ungated sequence network - basal ganglia (StrD1, StrD2, STN, GPe, GPi), thal_memory, buffer_memory, buffer_visual, and channel_visual_to_memory - showing the visual input overriding working memory.]

• Problem: If this connection always exists, then the visual input will always override what's in working memory. This connection needs to be controllable.
• Solution: Actions need to be able to control the flow of information between cortical areas. Instead of sending a particular SP to working memory, we need: IF X, THEN transfer the pattern in cortex area Y to cortex area Z.
  - In this case, we add a rule that says IF it contains a letter, transfer the data from the visual area to working memory.
  - We make the utility of this rule lower than the utility of the sequence rules, so that it will only transfer that information (open that gate) when no other action applies.
• open demo/sequencerouted.py

[Screen capture: the routed sequence network, now including gate_visual_memory controlling the channel from buffer_visual to buffer_memory.]
128. pulation with 100 neurons, 1 dimension, and a radius of 10 (default is 1)

    net.make('Combined', 225, 2, radius=15)
        # Make a population with 225 neurons and 2 dimensions, and set a larger
        # radius so a 10,10 input still fits within the circle of that radius

    net.make('D', 100, 1, radius=100)
        # Make a population with 100 neurons, 1 dimension, and a radius of 100
        # (default is 1)

    net.connect('input A', 'A')   # Connect all the relevant objects
    net.connect('input B', 'B')
    net.connect('A', 'Combined', transform=[1, 0])   # Connect with the given 1x2D mapping matrix
    net.connect('B', 'Combined', transform=[0, 1])

    def product(x):
        return x[0] * x[1]

    net.connect('Combined', 'D', func=product)
        # Create the output connection, mapping the 1D function 'product'

    net.add_to_nengo()

2.2.6 Circular Convolution

Purpose: This demo shows how to exploit the hrr library to do binding of vector representations.

Comments: The binding operator we use is circular convolution. This example is in a 10-dimensional space. This (or any similar binding operator; see work on vector symbolic architectures, VSAs) is important for cognitive models, because such operators let you construct structured representations in a high-dimensional vector space. A non-neural sketch of the operation itself is shown below.

Usage: The best way to change the input is to right-click the semantic pointer graphs and choose set value.
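Circular convolution itself is easy to compute outside of Nengo; a minimal numpy sketch of the binding operator (useful for checking results by hand, not the demo code itself):

    import numpy as np

    def cconv(a, b):
        # circular convolution = element-wise multiplication in the Fourier domain
        return np.real(np.fft.ifft(np.fft.fft(a) * np.fft.fft(b)))

    a = np.random.randn(10); a /= np.linalg.norm(a)
    b = np.random.randn(10); b /= np.linalg.norm(b)
    c = cconv(a, b)    # the bound vector a (*) b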
129. py'), or, if the file is in another directory:

    python.execfile('nengo/demo/addition.py')

We can also execute single Python commands, like this:

    python.exec('net.view(play=0.5)')

4.4.3 Step 3: Running a simulation and gathering data

However, what we really want to do is use Matlab to run a simulation and gather whatever data we need. Here is how we can do that.

First, we need to know how to get access to objects created in the Python script. For example, if we made an ensemble called C with C=net.make('C',100,1) in the script, we can get this object in Matlab as follows:

    C = javaMethod('__tojava__', python.get('C'), java.lang.Class.forName('java.lang.Object'));

Now we can get the value of its origin with something like:

    C.getOrigin('X').getValues().getValues()

With this in mind, the following Matlab code runs a simulation and gathers the output from C's X origin:

    python.execfile('addition.py');

    % get the ensemble C
    C = javaMethod('__tojava__', python.get('C'), java.lang.Class.forName('java.lang.Object'));

    % run the simulation
    t = 0.0;
    dt = 0.001;
    data = [];
    w = waitbar(0, 'Running simulation');
    while t < 1
        waitbar(t);
        command = sprintf('net.network.simulator.run(%f,%f,%f)', t, t + dt, dt);
        python.exec(command);
        data = [data, C.getOrigin('X').getValues().getValues()];
        t = t + dt;
    end
    close(w);

4.5 Generating Large Ensembles (500 to 5000 neurons)
130. r
    from com.bulletphysics.linearmath import Transform
    from javax.vecmath import Vector3f

    dt = 0.001
    N = 1
    pstc = 0.01

    net = nef.Network('simple arm controller')

    class getShoulder(ca.nengo.math.Function):
        def map(self, X):
            x = float(X[0])
            y = float(X[1])
            # make sure we're in the unit circle
            if sqrt(x**2 + y**2) > 1:
                x = x / sqrt(x**2 + y**2)
                y = y / sqrt(x**2 + y**2)
            L1 = .5
            L2 = .5
            EPS = 1e-10
            D = (x**2 + y**2 - L1**2 - L2**2) / (2 * L1 * L2)   # law of cosines
            if x**2 + y**2 < (L1**2 + L2**2):
                D = -D
            # find elbow-down angles from shoulder to elbow
            # java.lang.System.out.println("x: %1.2f  y: %1.2f" % (x, y))
            if D < 1 and D > -1:
                elbow = acos(D)
            else:
                elbow = 0
            if x**2 + y**2 < (L1**2 + L2**2):
                elbow = pi - elbow
            if x == 0 and y == 0:
                y = y + EPS
            inside = L2 * sin(elbow) / sqrt(x**2 + y**2)   # magic numbers from matlab
            if inside > 1: inside = 1
            if inside < -1: inside = -1
            if x == 0:
                shoulder = 1.5708 - asin(inside)
            else:
                shoulder = atan(y / x) - asin(inside)
            if x < 0:
                shoulder = shoulder + pi
            return shoulder

        def getDimension(self):
            return 2

    class getElbow(ca.nengo.math.Function):
        def map(self, X):
            x = float(X[0])
            y = float(X[1])
            # make sure we're in the unit circle
            if sqrt(x**2 + y**2) > 1:
                x = x / sqrt(x**2 + y**2)
                y = y / sqrt(x**2 + y**2)
            L1 = .5
            L2 = .5
            D = (x**2 + y**2 - L1**2 - L2**2) / (2 * L1 * L2)   # law of cosines
131. r as origins, but these functions take an input value (usually denoted x), which is an array of floats containing the input. When defining the termination, we also have to define the number of dimensions expected. We do this by setting the dimensions parameter (which defaults to 1). We can also specify the post-synaptic time constant at this termination by setting the pstc parameter (default is None).

For example, the following object takes a 5-dimensional input vector and outputs the largest of the received values:

    class Largest(nef.SimpleNode):
        def init(self):
            self.largest = 0
        def termination_values(self, x, dimensions=5, pstc=0.01):
            self.largest = max(x)
        def origin_largest(self):
            return self.largest

    net = nef.Network('largest')
    net.make_input('input', [0] * 5)
    largest = net.add(Largest('largest'))
    net.connect('input', largest.getTermination('values'))

Note: When making a component like this, make sure to define an initial value for largest (or whatever internal parameter is being used to map inputs to outputs) inside the init(self) function. This function will be called before the origins are evaluated, so that there is a valid self.largest return value.

3.5.4 Arbitrary Code

You can also define a function that will be called every time step, but which is not tied to a particular Origin or Termination. This function is
132. r > export PATH, and then restart your bash shell. You can type printenv to see whether the changes have taken effect.

Step 3: Install CUDA code samples (optional)

This step is not strictly necessary, but it can help to ensure that your driver is installed properly and that CUDA code has access to the GPU, and it can be useful in troubleshooting various other problems.

1. Be sure you have downloaded the CUDA code sample installer for your system from the link provided above. Note where the file gets downloaded.
2. Run the installer with:

    sudo sh <samples installer name>

Your home directory is generally a good place to install it. The installer creates a folder called NVIDIA_GPU_Computing_SDK in the location you chose. NVIDIA_GPU_Computing_SDK/C/src contains a series of subdirectories, each of which contains CUDA source code that can be compiled into binary files and then executed.
3. cd into NVIDIA_GPU_Computing_SDK/C and enter the command make. This will compile the many CUDA source code samples in the C/src directory, creating a series of executable binaries in C/bin/linux/release. Sometimes make may fail to compile one of the programs, which will halt the
...projection from a 2D to a 1D space (i.e., multiplication).

Usage: When you run the network, it automatically has random white-noise input injected into it in both dimensions.

Turn learning on: To allow the learning rule to work, you need to move the switch to +1.

Monitor the error: When the simulation starts and learning is on, the error is high. After about 10s it will do a reasonable job of computing the product, and the error should be quite small.

Is it working? To see if the right function is being computed, compare the pre and post population value graphs. You should note that if either dimension in the input is small, the output will be small. Only when both dimensions have larger absolute values does the output go away from zero (see the screen capture below).

Output: See the screen capture below. (Screen capture of the Learn Product network in interactive plots: switch, error, pre, post, and input graphs.)

Code:

    N = 60
    D = 2

    import nef
    import nef.templates.learned_termination as learning
    import nef.templates.gate as gating
    import random

    random.seed(37)

    net = nef.Network('Learn Product')
...=True, encoders=None)

Connect two nodes in the network. pre and post can be strings giving the names of the nodes, or they can be the nodes themselves (FunctionInputs and NEFEnsembles are supported). They can also be actual Origins or Terminations, or any combination of the above. If post is set to an integer or None, an origin will be created on the pre population, but no other action will be taken.

pstc is the post-synaptic time constant of the new Termination.

If transform is not None, it is used as the transformation matrix for the new termination. You can also use weight, index_pre, and index_post to define a transformation matrix instead: weight gives the value, and index_pre and index_post identify which dimensions to connect (see nef.Network.compute_transform for more details). For example,

    net.connect(A, B, weight=5)

with both A and B as 2-dimensional ensembles, will use [[5, 0], [0, 5]] as the transform. Also, you can do

    net.connect(A, B, index_pre=2, index_post=5)

to connect the 3rd element in A to the 6th in B. You can also do

    net.connect(A, B, index_pre=[0, 1, 2], index_post=[5, 6, 7])

to connect multiple elements.

If func is not None, a new Origin will be created on the pre-synaptic ensemble that will compute the provided function. The name of this origin will be taken from the name of the function, or from origin_name if provided.
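A short sketch of how these arguments are typically combined in practice (the ensemble names and the product function here are illustrative, not taken from the reference itself):

    import nef

    net = nef.Network('connect example')
    net.make('A', 100, 2)
    net.make('B', 100, 1)

    def product(x):
        return x[0] * x[1]

    # creates an origin named 'product' on A and a termination on B with a 20ms time constant
    net.connect('A', 'B', func=product, pstc=0.02)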
        # run population A and determine which neurons spike
        spikes_A = run_neurons(input_A, v_A, ref_A)

        # decay all of the inputs (implementing the post-synaptic filter)
        for j in range(N_B):
            input_B[j] *= (1.0 - pstc_scale)

        # for each neuron that spikes, increase the input current of all the
        # neurons it is connected to by the synaptic connection weight
        for i, s in enumerate(spikes_A):
            if s:
                for j in range(N_B):
                    input_B[j] += weights[i][j] * pstc_scale

        # compute the total input into each neuron in population B
        # (taking into account gain and bias)
        total_B = [0] * N_B
        for j in range(N_B):
            total_B[j] = gain_B[j] * input_B[j] + bias_B[j]

        # run population B and determine which neurons spike
        spikes_B = run_neurons(total_B, v_B, ref_B)

        # for each neuron in B that spikes, update our decoded value
        # (also applying the same post-synaptic filter)
        output *= (1.0 - pstc_scale)
        for j, s in enumerate(spikes_B):
            if s:
                output += decoder_B[j][0] * pstc_scale

        print t, output
        times.append(t)
        inputs.append(x)
        outputs.append(output)
        ideal.append(function(x))

        t += dt

    ################################################################
    # Step 3: Plot the results
    ################################################################

    x, A = compute_tuning_curves(encoder_A, gain_A, bias_A)
    x, B = compute_tuning_curves(encoder_B, gain_B, bias_B)

    import pylab
    pylab.figure()
    pylab.plot(x, A)
    pylab.title('Tuning curves for population A')
Values other than 0 can also be used:

    net.make_input('inputD', values=[0.5, 0.3, 27])

3.2.4 Computing linear functions

To have components be useful, they have to be connected to each other. To assist this process, the nef.Network.connect function will create the necessary Origins and/or Terminations, as well as the Projection:

    net.make('A', 100, 1)
    net.make('B', 100, 1)
    net.connect('A', 'B')

You can also specify a transformation matrix (to allow for the computation of any linear function) and a post-synaptic time constant:

    net.make('A', 100, 2)
    net.make('B', 100, 3)
    net.connect('A', 'B', transform=[[0, 0.5], [1, 0], [0, 0.5]], pstc=0.03)

3.2.5 Computing nonlinear functions

To compute nonlinear functions, you can specify a function to compute, and an origin will automatically be created:

    net.make('A', 100, 1)
    net.make('B', 100, 1)
    def square(x):
        return x[0] * x[0]
    net.connect('A', 'B', func=square)

This also works for highly complex functions:

    net.make('A', 100, 5)
    net.make('B', 100, 1)
    import math
    def strange(x):
        if x[0] < 0.4:
            return 0.3
        elif x[1] * x[2] < 0.3:
            return math.sin(x[3])
        else:
            return x[4]
    net.connect('A', 'B', func=strange)

Nengo will automatically solve for the decoders and connection weights needed to approximate these highly complex functions.

3.3 Configuring Neural Ensembles
...models build very quickly, since Nengo takes advantage of these identical ensembles and does not need to re-compute decoders. However, it leads to neurally implausible models, since neurons are not identical in this way.

As an alternative, you can use the second way of specifying seeds, the seed parameter:

    net = nef.Network('My Network', seed=5)

With this approach, Nengo will create a new seed for every ensemble, based on your initial seed value. There will be no identical neurons in your model. However, if you re-run your script, Nengo will re-generate exactly the same neurons, and so will be able to re-use the previously computed decoders.

3.4.2 Sharing Consistent Models

A useful side effect of the seed parameter is that it allows us to be sure that exactly the same model is generated on different computers. That is, if you build a model and set a specific seed, then other researchers can run that same script and will get exactly the same model as you created. In this way, we can ensure consistent performance.

3.4.3 Individual Differences

One way to think about the seed parameter is as a way to generate neural models with individual differences. The same script run with a different seed value will create a slightly different model, due to the low-level neural differences. This may affect the overall behaviour.

3.5 Adding Arbitrary Code to a Model

Nengo models are composed of Nodes. Each Node has Origins (outputs) and Terminations (inputs), allowing them to be connected up to perform operations.
    # This script requires Python (http://www.python.org) and Numpy
    # (http://numpy.scipy.org) to run, and Matplotlib (http://matplotlib.org)
    # to produce the output graphs.
    #
    # For more information on the Neural Engineering Framework and the Nengo
    # software, please see http://nengo.ca

    import random
    import math

    ################################################################
    # Parameters
    ################################################################

    dt = 0.001        # simulation time step
    t_rc = 0.02       # membrane RC time constant
    t_ref = 0.002     # refractory period
    t_pstc = 0.1      # post-synaptic time constant
    N_A = 50          # number of neurons in first population
    N_B = 40          # number of neurons in second population
    N_samples = 100   # number of sample points to use when finding decoders
    rate_A = 25, 75   # range of maximum firing rates for population A
    rate_B = 50, 100  # range of maximum firing rates for population B

    # the input to the system over time
    def input(t):
        return math.sin(t)

    # the function to compute between A and B
    def function(x):
        return x * x

    ################################################################
    # Step 1: Initialization
    ################################################################

    # create random encoders for the two populations
    encoder_A = [random.choice([-1, 1]) for i in range(N_A)]
    encoder_B = [random.choice([-1, 1]) for i in range(N_B)]

    def generate_gain_and_bias(count, intercept_low, intercept_high, rate
...noise to the simulation. To make the inputs to neurons noisy, you can specify an amount of noise and a noise frequency (how often a new noise value is sampled from the uniform distribution between -noise and +noise). Each neuron will sample from this distribution at this rate and add the resulting value to its input current. The frequency defaults to 1000Hz:

    net.make('H', 50, 1, noise=0.5, noise_frequency=1000)

3.7.3 Random inputs

Here is how you can convert an input to provide a randomly changing value, rather than a constant:

    net.make_fourier_input('input', dimensions=1, base=0.1, high=10, power=0.5, seed=0)

This will produce a randomly varying input. This input will consist of random sine waves varying from 0.1Hz to 10Hz in 0.1Hz increments. The random number seed used is 0.

3.7.4 Changing modes (spiking, rate, and direct)

You can set an ensemble to be simulated as spiking neurons, rate neurons, or directly (no neurons). The default is spiking neurons:

    net.make('J', neurons=1, dimensions=100, mode='direct')
    net.make('K', neurons=50, dimensions=1, mode='rate')

One common usage of direct mode is to quickly test out algorithms without worrying about the neural implementation. This can be especially important when creating algorithms with large numbers of dimensions, since they would require large numbers of neurons to simulate.
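A hedged sketch of that prototyping pattern (the network, names, and dimensionality are illustrative): build the high-dimensional part in direct mode first, then remove the mode argument and raise the neuron count once the algorithm behaves as expected.

    import nef

    D = 100   # a dimensionality that would need many neurons to simulate
    net = nef.Network('prototype')
    net.make_fourier_input('input', dimensions=D, base=0.1, high=2, power=0.3, seed=1)
    net.make('memory', neurons=1, dimensions=D, mode='direct')   # no neurons are simulated yet
    net.connect('input', 'memory')
    net.connect('memory', 'memory', pstc=0.1)   # simple recurrent memory, evaluated directly
    net.add_to_nengo()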
    net.connect(r.getOrigin('shoulderAngle'), funcT.getTermination('shoulder'))
    net.connect(r.getOrigin('elbowAngle'), funcT.getTermination('elbow'))

    # put everything in direct mode
    net.network.setMode(ca.nengo.model.SimulationMode.DIRECT)
    # except the last population
    controlU.setMode(ca.nengo.model.SimulationMode.DEFAULT)

CHAPTER THREE

NENGO SCRIPTING

This documentation describes the scripting libraries available for use in Nengo.

3.1 Scripting Interface

Many researchers prefer to create models by writing scripts rather than using a graphical user interface. To accomplish this, we have embedded a scripting system within Nengo. The scripting language used in Nengo is Python; detailed documentation on Python syntax can be found at http://www.python.org. Nengo uses Jython (http://jython.org) to interface between the script and the underlying Java implementation of Nengo. This allows us to use Python syntax and still have access to all of the underlying Java code.

3.1.1 Running scripts from a file

You can create scripts using your favourite text editor. When saved, they should have a .py extension. To run the script, click on the Open icon or select File > Open from file from the main menu.

3.1.2 Running scripts from the console

You can also use scripting to interact with a model within Nengo.
- If you click on an object in the GUI so that it is highlighted in yellow, this same object is available by the name "that" in the script console:
  - Click on an ensemble.
  - Open the script console and type: print that.neurons
  - Type: that.neurons = 50
- You can also run scripts by typing run scriptname.py, by opening the script using File > Open, or by clicking the folder icon in the top left.

1.2 Linear Transformations

1.2.1 Creating Terminations

- Connections between ensembles are built using Origins and Terminations. The Origin from one ensemble can be connected to the Termination on the next ensemble.
- Create two ensembles. They can have different neural properties and different numbers of neurons, but for now make sure they are both one-dimensional.
- Drag the Termination icon from the sidebar onto the second ensemble.
  - Provide a name (for example, "input").
  - Set the input dimension to 1 and use Set Weights to set the connection weight to 1.
  - Set tauPSC to 0.01. This synaptic time constant differs according to which neurotransmitter is involved; 10ms is the time constant for AMPA (5-10ms).

(Screen captures of the Termination template dialog and the weight editor omitted.)
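For readers following along with the scripting interface instead of the GUI, a rough equivalent of the connection set up above (ensemble names and sizes are illustrative) is:

    import nef

    net = nef.Network('Linear Transformation')
    net.make('A', 100, 1)
    net.make('B', 100, 1)
    # weight=1 and pstc=0.01 mirror the Set Weights and tauPSC settings described above
    net.connect('A', 'B', weight=1, pstc=0.01)
    net.add_to_nengo()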
...just uses f(x) = x.

- Create a new ensemble and a function input. The ensemble should be one-dimensional, with 100 neurons and a radius of 1. Use a Constant Function input set to 0.5. Create a termination on the ensemble and connect the function input to it.
- Now create a new origin that will estimate the square of the value, x².
  - Drag the Origin icon from the sidebar onto the ensemble.
  - Set the name to "square".
  - Click on Set Functions.
  - Select User-defined Function and press Set.
  - For the Expression, enter x0 * x0. We refer to the value as x0 because when we extend this to multiple dimensions, we will refer to them as x0, x1, x2, and so on.
  - Press OK, OK, and OK.
- You can now generate a plot that shows how good the ensemble is at calculating the non-linearity. Right-click on the ensemble and select Plot > Plot distortion: square.

(Distortion plot omitted: ideal, actual, and error curves over the range -1 to 1.)

- Start Interactive Plots.
- Create a control for the input so you can adjust it while the model runs (right-click on the input and select control).
- Create a graph of the square value from the ensemble. Do this by right-clicking on the ensemble in the Interactive Plots window and selecting square > value.
- For comparison, also create a graph for the standard X origin in the same way.
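The same squaring origin can also be produced from a script. This is a hedged sketch using the connect behaviour documented in the scripting chapter (passing None as the post-synaptic target only creates the origin); the names are illustrative:

    import nef

    net = nef.Network('Squaring')
    net.make_input('input', [0.5])
    net.make('A', 100, 1, radius=1)
    net.connect('input', 'A')

    def square(x):
        return x[0] * x[0]

    # create an origin called 'square' on A without connecting it anywhere yet
    net.connect('A', None, func=square, origin_name='square')
    net.add_to_nengo()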
    ...pstc=0.1)

    # Connect all the relevant objects with the relevant 1x2 mappings;
    # post-synaptic time constant is 10ms
    net.connect('control', 'A', transform=[0, 1], pstc=0.1)

    def feedback(x):
        return x[0] * x[1]

    # Create the recurrent connection, mapping the 1D function feedback
    # into the 2D population using the 1x2 transform
    net.connect('A', 'A', transform=[1, 0], func=feedback, pstc=0.1)

    net.add_to_nengo()

2.3.3 Controlled Integrator 2

Purpose: This demo implements a controlled one-dimensional neural integrator.

Comments: This is functionally the same as the other controlled integrator. However, the control signal is zero for integration, less than one for low-pass filtering, and greater than one for saturation. This behavior maps more directly to the differential equation used to describe an integrator, i.e., dx/dt = A x(t) + B u(t); the control in this circuit is A in that equation. This is also the controlled integrator described in the book How to Build a Brain.

Usage: When you run this demo, it will automatically put in some step functions on the input, so you can see that the output is integrating (i.e., summing over time) the input. You can also input your own values. It is quite sensitive, like the integrator. But if you reduce the control input below 0, it will not continuously add its input, but slowly allow that input to leak away.
...system allows you to make use of any Java library and any 100% Python library. There are two methods for doing this in Python. First, you can do it this way:

    import ca.nengo.utils

but then when you use the library you will have to provide its full name:

    y = ca.nengo.utils.DataUtils.filter(x, 0.005)

The other option is to import this way:

    from ca.nengo.utils import *

This allows you to just do:

    y = DataUtils.filter(x, 0.005)

3.2 Scripting Basics

The Jython scripting interface in Nengo provides complete access to all of the underlying Java functionality. This provides complete flexibility, but requires users to manually create all of the standard components (Origins, Terminations, Projections, etc.), resulting in fairly repetitive code. To simplify the creation of Nengo models, we have developed the nef module as a wrapper around the most common functions needed for creating Nengo networks.

As an example, the following code will create a new network with an input that feeds into ensemble A, which feeds into ensemble B:

    import nef
    net = nef.Network('Test Network')
    net.add_to_nengo()
    net.make_input('input', values=[0])
    net.make('A', neurons=100, dimensions=1)
    net.make('B', neurons=100, dimensions=1)
    net.connect('input', 'A')
    net.connect('A', 'B')

These scripts can be created with any text editor and saved with a .py extension.
make_fourier_input(name, dimensions=None, base=1, high=10, power=0.5, seed=None)

Create and return a FunctionInput that randomly varies. The variation is generated by randomly generating fourier components with frequencies that are multiples of base, up to high, and normalized to have an rms power of power.

Parameters:
- name (string): name of created node
- dimensions (int or None): dimensionality of the input. If None, will default to the longest of any of the lists given in the other parameters, or 1 if there are no lists.
- base (float or list): fundamental (lowest) frequency for the fourier series. If a list, will use different values for each dimension. Default is 1Hz.
- high (float or list): maximum frequency for the fourier series. If a list, will use different values for each dimension. Default is 10Hz.
- power (float or list): RMS power for the random function. If a list, will use different values for each dimension. Default is 0.5.
- seed (int or list or None): random number seed to use. If a list, will use different values for each dimension. If None, a random seed will be chosen.

Returns: the created FunctionInput

make_subnetwork(name)

Create and return a subnetwork. Subnetworks are just Network objects that are inside other Networks, and are useful for keeping a model organized.

Parameters:
- name (string): name of created node

Returns: the created Network
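A brief usage sketch combining the two calls above (the names are illustrative, and the assumption that the object returned by make_subnetwork supports the same make call follows from "Subnetworks are just Network objects"):

    import nef

    net = nef.Network('Fourier example')
    # a 2-D input whose two dimensions vary over different frequency ranges
    net.make_fourier_input('noise', dimensions=2, base=[0.1, 1], high=[10, 20], power=0.5, seed=1)
    net.make('A', 100, 2)
    net.connect('noise', 'A')

    # keep later additions organized inside a subnetwork
    sub = net.make_subnetwork('analysis')
    sub.make('B', 100, 2)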
...evaluate the basis function at. For example, for polynomials the basis functions would be computed as:

    def polynomial_basis(index, x):
        return x ** index

Parameters:
- node (Node): the Nengo component that represents the vector
- basis (function): the set of basis functions to use. This is a single function accepting two parameters, the basis index and the x value. It should return the corresponding y value.
- label (string): the text that will appear in the pop-up menu to activate this view
- minx (float): minimum x value to plot
- maxx (float): maximum x value to plot
- miny (float): minimum y value to plot
- maxy (float): maximum y value to plot

3.8.2 nef.SimpleNode

nef.SimpleNode
    alias of <fake_packages._fake_object object at 0x1fbcd90>

CHAPTER FOUR

ADVANCED NENGO USAGE

4.1 Interactive Plots Layout Files

The interactive plots mode in Nengo shows a wide variety of information about a running Nengo model, including graphs to interpret the value being represented within a network, and interactive controls which allow the inputs to the system to be varied. The exact configuration of this view is saved in a text file in the layouts directory, using the name of the network as an identifier. For example, the layout for the multiplication demo is saved as layouts/Multiply.layout.
...execute_selected_cell. About 10 lines below that should be:

    var code = cell.get_code();
    var msg_id = that.kernel.execute(cell.get_code());

Change it to:

    var code = cell.get_code();
    if (code.indexOf('nef.Network(') != -1) {
        code = 'from stats.ipython import run_in_nengo\nrun_in_nengo("""' + code + '""")';
    }
    var msg_id = that.kernel.execute(code);

4.7.4 Running a Nengo model from IPython notebook

- Go to your Nengo directory. You must run Python from the same directory that Nengo is in.
- Run the Python interface with: ipython notebook --pylab
- Create a New Notebook.
- Enter the following code to make a simple model:

    import nef
    net = nef.Network('Test')
    net.make('A', 50, 1)
    net.make_input('input', {0.1: 0.2, 0.5: -0.4, 0.6: 0.5, 0.8: -1.0})
    net.connect('input', 'A')
    log = net.log()
    log.add('input', tau=0)
    log.add('A')
    net.run(1.0)

- Run the model by pressing Ctrl+Enter. It should think for a bit and then output "finished running experiment Test.py" (it creates this name based on the name of the Network being run).
- As with running a model in Nengo, the logfile should be created as Test <date> <time>.csv inside your Nengo directory.

4.7.5 Analyzing data from a Nengo run

- As with Nengo, you can use the stats module to extract data from a log file. Create a new cell below your current one.
1. Be sure you have downloaded the CULA toolkit installer for your system, as mentioned in the introduction. Note where the file gets downloaded.
2. Run the installer with sudo sh <CULA installer name>. The installer will ask where you want to install CULA. Again, the default location (/usr/local/cula) is the most convenient, since parts of the GPU implementation assume it will be there; however, we can easily change these assumptions by changing some text files, so just note where you install it. Be sure to set the environment variables as recommended by the installer (see Step 2: Install CUDA Toolkit for the best way to do this).

Step 5: Compiling the shared libraries

1. cd into the directory NengoGPU. For developers, this can be found in simulator/src/c/NengoGPU. For users of the prepackaged version of Nengo, it should just be a subdirectory of the main Nengo folder.
2. Run configure to create the necessary symbolic links (you may have to chmod this to ensure that the permissions are set to execute).
3. If you installed CUDA in a location other than the default, open the file Makefile with your favourite text editor and edit it so that the variables CUDA_INC_PATH and CUDA_LIB_PATH point to the correct locations. If you installed CUDA in the default location, you don't have to change anything.
4. nvcc (the CUDA compiler) is incompatible with versions of gcc that are too new, where "too new" is a function of the nvcc version; gcc 4.4 is generally a safe choice.
...select the User-defined Function and press Set. Set the Expression to be x0 * x0. Press OK, OK, and OK to finish creating the origin. This new origin will calculate the square of the value represented by this ensemble, x².

If you connect this new origin to the Combined ensemble instead of the standard X origin, the network will calculate x²y instead of xy.

Note: To remove the X origin projection, drag it away from the Combined population. Right-click the end of the line and select Remove to delete the projection altogether.

(Screen capture of the Nonlinear Network with inputA, inputB, and the combined ensemble omitted.)

1.4 Feedback and Dynamics

1.4.1 Storing Information Over Time: Constructing an Integrator

- The basis of many of our cognitive models is the integrator. Mathematically, the output of this network should be the integral of the inputs to this network. Practically speaking, this means that if the input to the network is zero, then its output will stay at whatever value it is currently at. This makes it the basis of a neural memory system, as a representation can be stored over time. Integrators are also often used in sensorimotor systems, such as eye control.
- For an integrator, a neural ensemble needs to connect to itself with a transformation weight of 1.
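For reference, the standard NEF recipe for this circuit can also be written as a script. In this hedged sketch the names are illustrative, and scaling the input connection by the recurrent time constant is the usual NEF convention rather than something stated above:

    import nef

    tau = 0.1   # time constant of the recurrent connection
    net = nef.Network('Integrator')
    net.make_input('input', [0])
    net.make('A', 100, 1)
    net.connect('input', 'A', weight=tau, pstc=tau)   # input scaled by tau
    net.connect('A', 'A', weight=1, pstc=tau)         # recurrent connection with weight 1
    net.add_to_nengo()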
...the previous step, and then clicking the interactive plots icon in the upper right corner of the Nengo main window. Alternatively, right-click on the network and select Interactive Plots. Click the play arrow to start the simulation. Note: You don't need to select the network if there is only one available to run.
- Delete: Remove the demo network after using it by right-clicking and selecting Remove model. Note: You don't need to remove a model if you reload the same script; it will automatically be replaced.

More sophisticated examples can be found in the Model Archive at http://models.nengo.ca.

2.1 Introductory Demos

2.1.1 A Single Neuron

Purpose: This demo shows how to construct and manipulate a single neuron.

Comments: This leaky integrate-and-fire (LIF) neuron is a simple, standard model of a spiking single neuron. It resides inside a neural population, even though there is only one neuron.

Usage: Grab the slider control and move it up and down to see the effects of increasing or decreasing input. This neuron will fire faster with more input (an "on" neuron).

Output: See the screen capture below. (Screen capture of the Single Neuron demo: input slider, spike raster, and decoded value.)

Code:

    import nef

    net = nef.Network('Single Neuron')   # Create the network
    net.make_input('input', [-0.45])
...the simulation is a non-neural dynamical simulation, with just one crucial population being neural. That population is plotted in the visualizer, as are the generated control signals. The control signals are used to drive the arm, which is run in the physics simulator.

Usage: When you run the network, it will reach to the target (red ball). If you change refX and refY, that will move the ball, and the arm will reach for the new target location.

Output: See the screen capture below. (Screen capture of the simple arm controller: control signal populations, refX/refY sliders, and the physics view.)

Code:

    from __future__ import generators
    import sys

    import nef
    import space

    from ca.nengo.math.impl import *
    from ca.nengo.model.plasticity.impl import *
    from ca.nengo.util import *
    from ca.nengo.plot import *

    from com.bulletphysics import *
    from com.bulletphysics.linearmath import *
    from com.bulletphysics.dynamics.constraintsolver import *

    from math import *

    import java
    from java.awt import Color

    import ccm
    import random
    random.seed(11)

    from math import pi
    from com.threed.jpct import SimpleVector
...the standard Leaky Integrate-and-Fire neuron. Clicking on Set allows the neuron parameters to be configured:
  - tauRC: RC time constant for the neuron membrane (usually 0.02)
  - tauRef: absolute refractory period for the neuron (usually 0.002)
  - Max rate: the maximum firing rate for the neurons; each neuron will have a maximum firing rate chosen from a uniform distribution between low and high
  - Intercept: the range of possible x-intercepts on the tuning curve graph (normally set to -1 and 1)
- Because there are many parameters to set, and we often choose similar values, Nengo will remember your previous settings. (Deprecated: you can also save templates by setting up the parameters as you like them and clicking on New in the Templates box. You will then be able to go back to these settings by choosing the template from the drop-down box.)

(Screen capture of the Basic Network viewer omitted.)

- You can double-click on an ensemble to view the individual neurons within it.

(Screen capture of the NEFEnsemble viewer omitted.)

1.1.4 Plotting Tuning Curves

- This shows the behaviour of each neuron when it is representing different values, i.e., the tuning curves for the neurons.
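The same neuron properties can be supplied when an ensemble is created from a script. A hedged sketch (treating max_rate and intercept as keyword arguments taking the (low, high) ranges described above, which is an assumption; tau_rc and tau_ref are documented in the scripting chapter):

    import nef

    net = nef.Network('Tutorial ensemble')
    net.make('Ensemble', neurons=50, dimensions=1,
             tau_rc=0.02, tau_ref=0.002,
             max_rate=(100, 200), intercept=(-1, 1))
    net.add_to_nengo()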
...them and returning the new matrix. The following code randomly adjusts each connection weight by an amount sampled from a normal distribution of standard deviation 0.001:

    net.make('A', 100, 1)
    net.make('B', 100, 1)

    def randomize(w):
        for i in range(len(w)):
            for j in range(len(w[i])):
                w[i][j] += random.gauss(0, 0.001)
        return w

    net.connect('A', 'B', weight_func=randomize)

3.6.3 Sparsification

We can also use this approach to enforce sparsity constraints on our networks. For example, one could simply find all the small weights in the w matrix and set them to zero. The following code takes a slightly different approach that we have found to be a bit more robust. Here, if we want 20% connectivity (i.e., each neuron in population A is only connected to 20% of the neurons in population B), we simply randomly select 80% of the weights in the matrix and set them to zero. To make up for this reduction in connectivity, we also increase the remaining weights by scaling them by 1.0/0.2:

    net.make('A', 100, 1)
    net.make('B', 100, 1)

    p = 0.2
    def sparsify(w):
        for i in range(len(w)):
            for j in range(len(w[i])):
                if random.random() < p:
                    w[i][j] /= p
                else:
                    w[i][j] = 0.0
        return w

    net.connect('A', 'B', weight_func=sparsify)

3.7 Scripting Tips

3.7.1 Common transformation matrices

To simplify creating connection matrices for high-dimensional ensembles, you can use three additional parameters in the nef.Network.connect function.
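A quick example of the shorthand described in the nef.Network.connect reference earlier in this document, using weight, index_pre, and index_post instead of writing out a full transform matrix (the ensemble names are illustrative):

    import nef

    net = nef.Network('transform shorthand')
    net.make('six', 100, 6)
    net.make('three', 100, 3)

    # copy dimensions 0, 2, and 4 of 'six' into dimensions 0, 1, and 2 of 'three',
    # scaling each by 0.5, without constructing the full 3x6 transform by hand
    net.connect('six', 'three', weight=0.5, index_pre=[0, 2, 4], index_post=[0, 1, 2])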
...then press Set to define the value itself. For this model, set it to 0.5.

(Screen captures of the Constant Function dialog and the resulting external input node omitted.)

- Add a termination on the first neural ensemble as before, and create a projection, as before, from the new input to that ensemble.

(Screen capture of the network with the external input connected to ensembles A and B omitted.)

1.2.4 Interactive Plots

- To observe the performance of this model, we now switch over to Interactive Plots. This allows us to both graph the performance of the model and adjust its inputs on the fly to see how this affects behaviour.
- Start Interactive Plots by right-clicking inside the Network and selecting Interactive Plots, or clicking the double-sine icon at the top right.
- The text shows the various components of your model, and the arrows indicate the synaptic connections between them. You can move the components by left-click dragging them, and you can move all the components by dragging the background. You can hide a component by right-clicking it.
Nengo has built-in Node types for Neurons, Ensembles (groups of Neurons), Networks, Arrays, and so on. However, when creating a complex model, we may want to create our own Node type. This might be to provide custom input or output from a model, or it can be used to have a non-neural component within a larger model.

Technically, the only requirement for being a Node is that an object supports the ca.nengo.model.Node interface. However, for the vast majority of cases it is easier to make use of the nef.SimpleNode wrapper class.

3.5.1 SimpleNode

You create a nef.SimpleNode by subclassing it, defining functions to be called for whatever Origins and/or Terminations you want for this object. The functions you define will be called once every time step (usually 0.001 seconds). These functions can contain arbitrary code, allowing you to implement anything you want to. For example, the following code creates a Node that outputs a sine wave:

    import nef
    import math

    net = nef.Network('Sine Wave')

    # define the SimpleNode
    class SineWave(nef.SimpleNode):
        def origin_wave(self):
            return [math.sin(self.t)]

    wave = net.add(SineWave('wave'))
...values for the typecode parameter include codes for int16 and int64, as well as 'f' (float32), 'd' (float64), 'F' (complex64), and 'D' (complex128).

The first important thing you can do with this array is use full slice syntax. This is the notation used to access part of an array. A slice is a set of three values, all of which are optional: a:b:c means to start at index a, go to index b (but not include index b), and have a step size of c between items. The default for a is 0, for b is the length of the array, and for c is 1. For multiple dimensions, we put a comma between the slices for each dimension. The following examples are all for a 2D array. Note that the order of the 2nd and 3rd parameters is reversed from matlab, and it is all indexed starting at 0.

    a[0]        # the first row
    a[0,:]      # the first row
    a[:,0]      # the first column
    a[:3]       # the first three rows
    a[:,0:3]    # the first three columns
    a[:,:3]     # the first three columns (the leading zero is optional)
    a[:,2:]     # all columns from the 2nd to the end (the end value is optional)
    a[:,:-1]    # all columns except the last one (negative numbers index from the end)
    a[::2]      # just the even-numbered rows (skip every other row)
    a[::3]      # every third row
    a[::-1]     # all rows in reverse order
    a[:,::2]    # just the even-numbered columns (skip every other column)
    a[:,::-1]   # all columns in reverse order
    a.T         # transpose

With such an array, you can perform element-wise operations.
...function:

    def square(x):
        return [xx * xx for xx in x]
    net.connect(pre, error, func=square)

    net.connect(post, error, weight=-1)

    # Add a gate to turn learning on and off
    net.make_input('switch', [0])   # a controllable input function with a starting value of 0
    # Make a gate population with 40 neurons and a post-synaptic time constant of 10ms
    gating.make(net, name='Gate', gated='error', neurons=40, pstc=0.01)
    net.connect('switch', 'Gate')

    net.add_to_nengo()

2.6 Miscellaneous

2.6.1 Simple Vehicle

Purpose: This demo shows an example of integrating Nengo with a physics engine and renderer.

Comments: The demo shows a simple Braitenberg vehicle controller attached to a simulated robot (a Dalek from Dr. Who) that drives in a physical simulation. Most of the code is demonstrating how to construct a simple robot and a simple room in the physical simulator. Only the last few lines are used to construct the neural controller.

Usage: Hit play and the robot will drive around, avoiding the blocks and walls. The spikes are generated by measurements of distance from obstacles and by motor commands to the wheels.

Output: See the screen capture below. (Screen capture of the Braitenberg vehicle demo: left/right eye inputs and left/right motor populations.)
...ural ensemble, a variety of parameters can be adjusted. Some of these parameters are set to reflect the physiological properties of the neurons being modelled, while others can be set to improve the accuracy of the transformations computed by the neurons.

3.3.1 Membrane time constant and refractory period

The two parameters for Leaky Integrate-and-Fire neurons are the membrane time constant (tau_rc) and the refractory period (tau_ref). These parameters are set when creating the ensemble, and default to 0.02 seconds for the membrane time constant and 0.002 seconds for the refractory period:

    net.make('D', 100, 2, tau_rc=0.02, tau_ref=0.002)

Empirical data on the membrane time constants for different types of neurons in different parts of the brain can be found at http://ctn.uwaterloo.ca/cnrglab/?q=node/547.

3.3.2 Maximum firing rate

You can also specify the maximum firing rate for the neurons. It should be noted that it will always be possible to force these neurons to fire faster than this specified rate; indeed, the actual maximum firing rate will always be 1/tau_ref, since if enough current is forced into the simulated neuron, it will fire as fast as its refractory period will allow. However, what we can specify with this parameter is the normal operating range for the neurons.
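A hedged sketch of setting this operating range from a script (treating max_rate as a keyword argument that takes the (low, high) range, an assumption consistent with how the GUI describes this parameter elsewhere in this manual):

    import nef

    net = nef.Network('firing rate example')
    net.make('E', 100, 1, max_rate=(100, 200))   # maximum rates drawn uniformly between 100 and 200 Hz
    net.make('F', 100, 1, tau_rc=0.02, tau_ref=0.002, max_rate=(100, 200))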
3.5.2 Outputs (a.k.a. origins)

You can create as many outputs as you want from a SimpleNode, as long as each one has a distinct name. Each origin consists of a single function that will get called once per time step and must return an array of floats. When defining this function, it is often useful to know the current simulation time. This can be accessed as self.t, and is the time (in seconds) of the beginning of the current time step (the end of the current time step is self.t_end).

    class ManyOrigins(nef.SimpleNode):
        # an origin that is 0 for t < 0.5 and 1 for t >= 0.5
        def origin_step(self):
            if self.t < 0.5:
                return [0]
            else:
                return [1]

        # a triangle wave with a period of 1.0 seconds
        def origin_triangle(self):
            x = self.t % 1.0
            if x < 0.5:
                return [x * 2]
            else:
                return [2.0 - x * 2]

        # a sine wave and a cosine wave with frequency 10 Hz
        def origin_circle(self):
            theta = self.t * 2 * math.pi * 10
            return [math.sin(theta), math.cos(theta)]

When connecting a SimpleNode to other nodes, we need to specify which origin we are connecting. The name of the origin is determined by the function definition, which is of the form origin_<name>:

    net.make('A', 100, 1)
    net.make('B', 200, 2)

    many = net.add(ManyOrigins('many'))

    net.connect(many.getOrigin('triangle'), 'A')
    net.connect(many.getOrigin('circle'), 'B')

3.5.3 Inputs (a.k.a. Terminations)

To provide input to a SimpleNode, we define terminations. These are done in a similar manner as origins.
    ...make_convolution(net, 'A', 'B', 'C', N, quick=True)
    # Make a convolution network using the constructed populations
    conv2 = nef.convolution.make_convolution(net, 'C', 'E', 'F', N, invert_second=True, quick=True)
    # Make a correlation network by using convolution, but inverting the second input

    # Add elements to the vocabulary to use
    CIRCLE = vocab.parse('CIRCLE').v
    BLUE = vocab.parse('BLUE').v
    RED = vocab.parse('RED').v
    SQUARE = vocab.parse('SQUARE').v
    ZERO = [0] * D

    # Create the inputs
    inputA = {}
    inputA[0.0] = RED
    inputA[0.5] = BLUE
    inputA[1.0] = RED
    inputA[1.5] = BLUE
    inputA[2.0] = RED
    inputA[2.5] = BLUE
    inputA[3.0] = RED
    inputA[3.5] = BLUE
    inputA[4.0] = RED
    inputA[4.5] = BLUE
    net.make_input('inputA', inputA)
    net.connect('inputA', 'A')

    inputB = {}
    inputB[0.0] = CIRCLE
    inputB[0.5] = SQUARE
    inputB[1.0] = CIRCLE
    inputB[1.5] = SQUARE
    inputB[2.0] = CIRCLE
    inputB[2.5] = SQUARE
    inputB[3.0] = CIRCLE
    inputB[3.5] = SQUARE
    inputB[4.0] = CIRCLE
    inputB[4.5] = SQUARE
    net.make_input('inputB', inputB)
    net.connect('inputB', 'B')

    inputE = {}
    inputE[0.0] = ZERO
    inputE[0.2] = CIRCLE
    inputE[0.35] = RED
    inputE[0.5] = ZERO
    inputE[0.7] = SQUARE
    inputE[0.85] = BLUE
    inputE[1.0] = ZERO
    inputE[1.2] = CIRCLE
    inputE[1.35] = RED
...py extension, and run in Nengo by choosing File > Open from the menu, or by clicking on the blue Open icon in the upper left. All of the examples in the demo directory have been written using this system.

3.2.1 Creating a network

The first thing you need to do is to create your new network object and tell it to appear in the Nengo user interface:

    import nef
    net = nef.Network('My Network')
    net.add_to_nengo()

3.2.2 Creating an ensemble

Now we can create ensembles in our network. You must specify the name of the ensemble, the number of neurons, and the number of dimensions:

    net.make('A', neurons=100, dimensions=1)
    net.make('B', 1000, 2)
    net.make('C', 50, 10)

You can also specify the radius for the ensemble. The neural representation will be optimized to represent values inside a sphere of the specified radius. So, if you have a 2-dimensional ensemble and you want to be able to represent the value (10, 10), you should have a radius of around 15:

    net.make('D', 100, 2, radius=15)

3.2.3 Creating an input

To create a simple input that has a constant value (which can then be controlled via the interactive mode interface), do the following:

    net.make_input('inputA', values=[0])

The name can be anything, and values is an array of the required length. So for a 5-dimensional input you can do:

    net.make_input('inputB', values=[0, 0, 0, 0, 0])

or:

    net.make_input('inputC', values=[0] * 5)
...by specifying the inputs. We could also have read these values from a file:

    inputs = []
    for line in file('inputs.csv').readlines():
        row = [float(x) for x in line.strip().split(',')]
        inputs.append(row)

4.3.2 Outputs

When running experiments, we often don't want the complete record of every output the network makes over time. Instead, we're interested in what it is doing at specific points. For this case, we want to find out what the model's output is for each of the inputs. In particular, we want its output value just before we change the input to the next one in the list. The following code will collect these values and save them to a file called experiment.csv:

    class Output(nef.SimpleNode):
        def termination_save(self, x, dimensions=1, pstc=0.01):
            step = int(round(self.t / dt))
            if step % steps_per_input == steps_per_input - 1:
                f = file('experiment.csv', 'a+')
                f.write('%d,%1.3f\n' % (step / steps_per_input, x[0]))
                f.close()

4.3.3 Connecting the Model

For this particular example, here is a model that simply computes the product of its inputs. The inputs to the model are connected to the Input node, and the outputs go to the Output node (to be saved):

    net = nef.Network('Experiment Example')
    input = net.add(Input('input'))      # create the input node
    output = net.add(Output('output'))   # create the output node
