{"title": "Basins of Attraction for Electronic Neural Networks", "book": "Neural Information Processing Systems", "page_first": 524, "page_last": 533, "abstract": null, "full_text": "524 \n\nBASINS OF ATTRACTION FOR \nELECTRONIC NEURAL NETWORKS \n\nC. M. Marcus \n\nR. M. Westervelt \n\nDivision of Applied Sciences and Department of Physics \n\nHarvard University, Cambridge, MA 02138 \n\nABSTRACT \n\nWe have studied the basins of attraction for fixed point and \noscillatory attractors in an electronic analog neural network. Basin \nmeasurement circuitry periodically opens the network feedback loop, \nloads raster-scanned initial conditions and examines the resulting \nattractor. Plotting the basins for fixed points (memories), we show \nthat overloading an associative memory network leads to irregular \nbasin shapes. The network also includes analog time delay circuitry, \nand we have shown that delay in symmetric networks can introduce \nbasins for oscillatory attractors. Conditions leading to oscillation \nare related to the presence of frustration; reducing frustration by \ndiluting the connections can stabilize a delay network. \n\n(1) -\n\nINTRODUCTION \n\nThe dynamical system formed from an interconnected network of \nnonlinear neuron-like elements can perform useful parallel \ncomputation l - 5 . \nRecent progress in controlling the dynamics has \nfocussed on algorithms for encoding the location of fixed pOints 1 ,4 \nand on the stability of the flow to fixed points 3 \u2022 5-8. An equally \nimportant aspect of the dynamics is the structure of the basins of \nattraction, which describe the location of all pOints in initial \ncondition space which flow to a particular attractor 10 . 22 . \n\nIn a useful associative memory, an initial state should lead \nreliably to the \"closest\" memory. This requirement suggests that a \nwell-behaved basin of attraction should evenly surround its attractor \nand have a smooth and regular shape. 
One dimensional basin maps plotting \"pull-in\" probability against Hamming distance from an attractor do not reveal the shape of the basin in the high dimensional space of initial states 9,19. Recently, a numerical study of a Hopfield network with discrete time and two-state neurons showed rough and irregular basin shapes in a two dimensional Hamming space, suggesting that the high dimensional basin has a complicated structure 10. It is not known how the basin shapes change with the size of the network and the connection rule. \n\nWe have investigated the basins of attraction in a network with continuous state dynamics by building an electronic neural network with eight variable gain sigmoid neurons and a three level (+,0,-) interconnection matrix. We have also built circuitry that can map out the basins of attraction in two dimensional slices of initial state space (fig.1). The network and the basin measurements are described in section 2. \n\n@ American Institute of Physics 1988 \n\nIn section 3, we show that the network operates well as an associative memory and can retrieve up to four memories (eight fixed points) without developing spurious attractors, but that for storage of three or more memories, the basin shapes become irregular. \n\nIn section 4, we consider the effects of time delay. Real network components cannot switch infinitely fast or propagate signals instantaneously, so that delay is an intrinsic part of any hardware implementation of a neural network. We have included a controllable CCD (charge coupled device) analog time delay in each neuron to investigate how time delay affects the dynamics of a neural network. We find that networks with symmetric interconnection matrices, which are guaranteed to converge to fixed points for no delay, show collective sustained oscillations when time delay is present. 
By discovering which configurations are maximally unstable to oscillation, and looking at how these configurations appear in networks, we are able to show that by diluting the interconnection matrix, one can reduce or eliminate the oscillations in neural networks with time delay. \n\n(2) - NETWORK AND BASIN MEASUREMENT \n\nA block diagram of the network and basin measurement circuit is shown in fig.1. \n\nFig.1 Block diagram of the network and basin measurement system: sigmoid amplifier neurons in a feedback loop, with a comparator and digital oscillation detector that compares the network state to the desired memory. \n\nThe main feedback loop consists of non-linear amplifiers (\"neurons\", see fig.2) with capacitive inputs and a resistor matrix allowing interconnection strengths of -1/R, 0, +1/R (R = 100 k\u03a9). In all basin measurements, the input capacitance was 10 nF, giving a time constant of 1 ms. A charge coupled device (CCD) analog time delay 11 was built into each neuron, providing an adjustable delay per neuron over a range 0.4 - 8 ms. \n\nFig.2 Electronic neuron with inverting output. Non-linear gain provided by feedback diodes. Inset: nonlinear behavior at several different values of gain. \n\nAnalog switches allow the feedback path to be periodically disconnected and each neuron input charged to an initial voltage. The network is then reconnected and settles to the attractor associated with that set of initial conditions. Two of the initial voltages are raster scanned (on a time scale that is long compared to the load/run switching time) with function generators that are also connected to the X and Y axes of a storage scope. The beam of the scope is activated when the network settles into a desired attractor, producing an image of the basin for that attractor in a two-dimensional slice of initial condition space. The \"attractor of interest\" can be one of the 2^8 fixed points or an oscillatory attractor. 
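The load/run raster scan described above can be mimicked in software by integrating the network equations from a grid of initial conditions and recording which attractor each point reaches. The sketch below is illustrative only: it assumes the standard continuous-time model dx_i/dt = -x_i + sum_j T_ij tanh(beta x_j) with unit time constant rather than the exact circuit equations, and the gain, grid and integration parameters are arbitrary choices.

```python
import numpy as np

def settle(T, x0, beta=15.0, dt=0.01, t_max=30.0):
    # Forward-Euler integration of dx/dt = -x + T @ tanh(beta * x);
    # returns the sign pattern of the final (settled) state.
    x = np.array(x0, dtype=float)
    for _ in range(int(t_max / dt)):
        x += dt * (-x + T @ np.tanh(beta * x))
    return np.sign(x)

# Three neurons with symmetric non-inverting coupling (the fig.3 circuit).
T = np.array([[0, 1, 1],
              [1, 0, 1],
              [1, 1, 0]], dtype=float)

# Raster-scan initial voltages on neurons 1 and 2; neuron 3 starts at 0 V.
basin = {}
for v1 in np.linspace(-1, 1, 11):
    for v2 in np.linspace(-1, 1, 11):
        basin[(round(v1, 2), round(v2, 2))] = tuple(settle(T, [v1, v2, 0.0]))
```

Grouping the grid points by their final sign pattern gives a two dimensional basin slice like fig.3, one region flowing to the all-positive fixed point and one to the all-negative.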
A simple example of this technique is the case of three neurons with symmetric non-inverting connections shown in fig.3. \n\nFig.3 Basin of attraction for three neurons with symmetric non-inverting coupling. Slices are in the plane of initial voltages on neurons 1 and 2 (-1 V to 1 V). The two fixed points are all neurons saturated positive or all negative. The data are photographs of the scope screen. \n\n(3) BASINS FOR FIXED POINTS - ASSOCIATIVE MEMORY \n\nTwo dimensional slices of the eight dimensional initial condition space (for the full network) reveal important qualitative features about the high dimensional basins. Fig.4 shows a typical slice for a network programmed with three memories according to a clipped Hebb rule 1,12: \n\nT_ij = sgn( sum_{a=1..m} xi_i^a xi_j^a ),  T_ii = 0,   (1) \n\nwhere xi^a is an N-component memory vector of 1's and -1's, and m is the number of memories. The memories were chosen to be orthogonal (xi^a . xi^b = N delta_ab). \n\nFig.4 A slice of initial condition space shows the basins of attraction for five of the six fixed points for three memories in an eight-neuron Hopfield net. Learning rule was clipped Hebb (eq.1). Neuron gain = 15. Memories: (1,1,1,1,-1,-1,-1,-1), (1,-1,1,-1,1,-1,1,-1), (1,1,-1,-1,1,1,-1,-1). \n\nBecause the Hebb rule (eq. 
1) makes xi^a and -xi^a stable attractors, a three-memory network will have six fixed point attractors. In fig.4, the basins for five of these attractors are visible, each produced with a different rastering pattern to make it distinctive. Several characteristic features should be noted: \n\n-- All initial conditions lead to one of the memories (or their inverses); no spurious attractors were seen for three or four memories. This is interesting in light of the well documented emergence of spurious attractors at m/N \u2248 15% in larger networks with discrete time 2,18. \n\n-- The basins have smooth and continuous edges. \n\n-- The shapes of the basins as seen in this slice are irregular. Ideally, a slice with attractors at each of the corners should have rectangular basins, one basin in each quadrant of the slice and the location of the lines dividing quadrants determined by the initial conditions on the other neurons (the \"unseen\" dimensions). With three or more memories the actual basins do not resemble this ideal form. \n\n(4) TIME DELAY, FRUSTRATION AND SUSTAINED OSCILLATION \n\nArguments defining conditions which guarantee convergence to fixed points 3,5,6 (based, for example, on the construction of a Liapunov function) generally assume instantaneous communication between elements of the network. In any hardware implementation, these assumptions break down due to the finite switching speed of amplifiers and the charging time of long interconnect lines. 13 It is the ratio of delay/RC which is important for stability, so keeping this ratio small limits how fast a neural network chip can be designed to run. Time delay is also relevant to biological neural nets where propagation and response times are comparable. 14,15 \n\nOur particular interest in this section is how time delay can lead to sustained oscillation in networks which are known to be stable when there is no delay. 
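Returning briefly to section 3: the clipped Hebb rule of eq.1 and the retrieval seen in fig.4 are easy to check numerically. A minimal sketch, assuming the sign-clipped form of eq.1 and the three orthogonal memories listed in fig.4:

```python
import numpy as np

# The three orthogonal memories of fig.4 (N = 8).
memories = np.array([
    [ 1,  1,  1,  1, -1, -1, -1, -1],
    [ 1, -1,  1, -1,  1, -1,  1, -1],
    [ 1,  1, -1, -1,  1,  1, -1, -1],
])
assert np.array_equal(memories @ memories.T, 8 * np.eye(3))  # orthogonal

# Clipped Hebb rule (eq.1): T_ij = sgn(sum_a xi_i^a xi_j^a), T_ii = 0.
T = np.sign(memories.T @ memories)
np.fill_diagonal(T, 0)

# Each memory and its inverse is a fixed point of the sign dynamics
# x -> sgn(T x), matching the six fixed point attractors of fig.4.
for xi in memories:
    assert np.array_equal(np.sign(T @ xi), xi)
    assert np.array_equal(np.sign(T @ -xi), -xi)
```

For these memories T @ xi comes out as 3 * xi, so retrieval survives the clipping with margin.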
We therefore restrict our attention to networks with symmetric interconnection matrices (T_ij = T_ji). \n\nAn obvious ingredient in producing oscillations in a delay network is feedback, or stated another way, a graph representing the connections in a network must contain loops. \n\nThe simplest oscillatory structure made of delay elements is the ring oscillator (fig.5a). Though not a symmetric configuration, the ring oscillator illustrates an important point: the ring will oscillate only when there is negative feedback at dc - that is, when the product of interconnections around the loop is negative. Positive feedback at dc (loop product of connections > 0) will lead to saturation. \n\nObserving various symmetric configurations (e.g. fig.5b) in the delayed-neuron network, we find that a negative product of connections around a loop is also a necessary condition for sustained oscillation in symmetric circuits. An important difference between the ring (fig.5a) and the symmetric loop (fig.5b) is that the period of oscillation for the ring is the total accumulated delay around the ring - the larger the ring the longer the period. In contrast, for those symmetric configurations which have oscillatory attractors, the period of oscillation is roughly twice the delay, regardless of the size of the configuration or the value of delay. This indicates that for symmetric configurations the important feedback path is local, not around the loop. \n\nFig.5 (a) A ring oscillator: needs negative feedback at dc to oscillate. (b) Symmetrically connected triangle. This configuration is \"frustrated\" (defined in text), and has both oscillatory and fixed point attractors when neurons have delay. Filled circles are time-delay neurons; solid edges are non-inverting and dashed edges inverting connections. \n\nConfigurations with loop connection product < 0 are important in the theory of spin glasses 16, where such configurations are called \"frustrated.\" Frustration in magnetic (spin) systems gives a measure of \"serious\" bond disorder (disorder that cannot be removed by a change of variables) which can lead to a spin glass state. 16,17 Recent results based on the similarity between spin glasses and symmetric neural networks have shown that storage capacity limitations can be understood in terms of this bond disorder. 18,19 Restating our observation above: we only find stable oscillatory modes in symmetric networks with delay when there is frustration. A similar result for a sign-symmetric network (T_ij, T_ji both \u2265 0 or \u2264 0) with no delay is described by Hirsch. 6 \n\nWe can set up the basin measurement system (fig.1) to plot the basin of attraction for the oscillatory mode. Fig.6 shows a slice of the oscillatory basin for a frustrated triangle of delay neurons. \n\nFig.6 Basin for oscillatory attractor (cross-hatched region) in frustrated triangle of delay-neurons; slices span -1.5 V to 1.5 V on neurons 1 and 2. Connections were all symmetric and inverting; other frustrated configurations (e.g. two non-inverting, one inverting, all symmetric) were similar. (6a): delay = 0.48RC, inset shows trajectory to fixed point and oscillatory mode for two close-lying initial conditions. (6b): delay = 0.61RC, basin size increases. \n\nA fully connected feedback associative network with more than one memory will contain frustration. 
As more memories are added, the amount of frustration increases until memory retrieval disappears. But before this point of memory saturation is reached, delay could cause an oscillatory basin to open. In order to design out this possibility, one must understand how frustration, delay and global stability are related. A first step in determining the stability of a delay network is to consider which small configurations are most prone to oscillation, and then see how these \"dangerous\" configurations show up in the network. As described above, we only need to consider frustrated configurations. \n\nA frustrated configuration of neurons can be sparsely connected, as in a loop, or densely connected, with all neurons connected to all others, forming what is called in graph theory a \"clique.\" Representing a network with inverting and non-inverting connections as a signed graph (edges carry + and -), we define a frustrated clique as a fully connected set of vertices (r vertices, r(r-1)/2 edges) with all sets of three vertices in the clique forming frustrated triangles. Some examples of frustrated loops and cliques are shown in fig.7. Notice that neurons connected with all inverting symmetric connections, a configuration that is useful as a \"winner-take-all\" circuit, form a frustrated clique. \n\nFig.7 Examples of frustrated loops and frustrated cliques (fully connected, all triangles frustrated). In the graph representation vertices (black dots) are neurons (with delay) and undirected edges are symmetric connections; dashed edges are inverting, solid edges non-inverting. 
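The signed-graph definitions above translate directly into code. A small sketch, assuming connection signs are stored in a symmetric integer matrix with 0 meaning no connection:

```python
import numpy as np
from itertools import combinations

def is_frustrated_triangle(J, i, j, k):
    # A triangle is frustrated when the product of its edge signs is negative.
    return J[i, j] * J[j, k] * J[k, i] < 0

def is_frustrated_clique(J, vertices):
    # Fully connected, with every triangle of vertices frustrated.
    if any(J[i, j] == 0 for i, j in combinations(vertices, 2)):
        return False
    return all(is_frustrated_triangle(J, i, j, k)
               for i, j, k in combinations(vertices, 3))

# All-inverting symmetric connections (the winner-take-all configuration):
# every triangle product is (-1)^3 < 0, so the whole set is a frustrated clique.
r = 4
J = -np.ones((r, r), dtype=int)
np.fill_diagonal(J, 0)
assert is_frustrated_clique(J, range(r))
```

The same predicate, scanned over vertex subsets of a large connection matrix, counts the dangerous configurations discussed below.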
We find that delayed neurons connected in a frustrated loop longer than three neurons do not show sustained oscillation for any value of delay (tested up to delay = 8RC). In contrast, when delayed neurons are connected in any frustrated clique configuration, we do find basins of attraction for sustained oscillation as well as fixed point attractors, and the larger the frustrated clique, the more easily it oscillates, in the following ways: (1) For a given value of delay/RC, the size of the oscillatory basin increases with r, the size of the frustrated clique (fig.8). (2) The critical value of delay at which the volume of the oscillatory basin goes to zero decreases with increasing r (fig.9); for r=8 the critical delay is already less than 1/30 RC. \n\nFig.8 Size of basin for oscillatory mode increases with size of frustrated clique. The delay is 0.46RC per neuron in each picture. Slices are in the space of initial voltages on neurons 1 and 2, other initial voltages near zero. \n\nFig.9 The critical value of delay where the oscillatory mode vanishes, measured by reducing delay until the system leaves the oscillatory attractor. Delay plotted in units of the characteristic time R_io C, where R_io = (sum_j 1/R_ij)^-1 = 10^5 \u03a9/(r-1) and C = 10 nF, indicating that the critical delay decreases faster than 1/(r-1). 
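The delay-induced oscillation of a frustrated configuration can be reproduced numerically. The sketch below is a rough model, not the circuit equations: it integrates dx_i/dt = -x_i + sum_j T_ij tanh(beta x_j(t - tau)) by Euler's method for the all-inverting symmetric triangle, with gain, delay and initial-condition values chosen for illustration (time in units of RC):

```python
import numpy as np

def simulate(tau, beta=5.0, dt=0.001, t_max=20.0):
    # Euler integration of dx/dt = -x + T @ tanh(beta * x(t - tau))
    # for an all-inverting (frustrated) symmetric triangle.
    T = -np.ones((3, 3)) + np.eye(3)      # T_ij = -1 off-diagonal
    steps = int(t_max / dt)
    lag = int(tau / dt)
    xs = np.zeros((steps + 1, 3))
    xs[0] = [0.5, 0.5, 0.5]               # symmetric initial condition
    for n in range(steps):
        delayed = xs[max(n - lag, 0)]     # constant history before t = 0
        xs[n + 1] = xs[n] + dt * (-xs[n] + T @ np.tanh(beta * delayed))
    return xs

no_delay = simulate(tau=0.0)
delayed = simulate(tau=0.5)

# Without delay this symmetric initial state decays to a fixed point ...
assert np.abs(no_delay[-1]).max() < 1e-3
# ... while a delay of about half RC sustains an oscillation (period near 2*tau).
late = delayed[-2000:, 0]                 # last two time constants
assert late.max() - late.min() > 0.3
```

Other initial conditions can instead flow to fixed points, consistent with the coexisting fixed point and oscillatory basins of fig.6.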
Having identified frustrated cliques as the maximally unstable configuration of time delay neurons, we now ask how many cliques of a given size we expect to find in a large network. \n\nA set of r vertices (neurons) can be fully connected by r(r-1)/2 edges of two types (+ or -) to form 2^{r(r-1)/2} different cliques. Of these, 2^{r-1} will be frustrated cliques. Fig.10 shows all 2^{4-1} = 8 cases for r=4. \n\nFig.10 All graphs of size r=4 that are frustrated cliques (fully connected, every triangle frustrated). Solid lines = positive edges, dashed lines = negative edges. \n\nFor a randomly connected network, this result combined with results from random graph theory 20 gives an expected number of frustrated cliques of size r in a network of size N, E_N(r): \n\nE_N(r) = C(N,r) c(r,p)   (2) \n\nc(r,p) = 2^{-(r-1)(r-2)/2} p^{r(r-1)/2}   (3) \n\nwhere C(N,r) is the binomial coefficient and c(r,p) is defined as the concentration of frustrated cliques. p is the connectance of the network, defined as the probability that any two neurons are connected. Eq.3 is the special case where + and - edges (non-inverting, inverting connections) are equally probable. We have also generalized this result to the case p(+) \u2260 p(-). \n\nFig.11 shows the dramatic reduction in the concentration of all frustrated configurations in a diluted random network. For the general case (p(+) \u2260 p(-)) we find that the negative connections affect the concentrations of frustrated cliques more strongly than the positive connections, as expected (frustration requires negatives, not positives; see fig.10). 
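The counting behind eqs. 2 and 3 can be checked by brute force for small r: enumerate all sign assignments of the complete graph K_r and count those in which every triangle is frustrated.

```python
from itertools import combinations, product

def count_frustrated_signings(r):
    # Count sign assignments on the r(r-1)/2 edges of K_r for which
    # every triangle has a negative sign product.
    edges = list(combinations(range(r), 2))
    triangles = list(combinations(range(r), 3))
    count = 0
    for signs in product((+1, -1), repeat=len(edges)):
        J = dict(zip(edges, signs))
        if all(J[(i, j)] * J[(j, k)] * J[(i, k)] < 0
               for i, j, k in triangles):
            count += 1
    return count

# Of the 2^(r(r-1)/2) signings, exactly 2^(r-1) are frustrated cliques
# (fig.10 shows the 8 cases for r = 4), so the concentration in eq.3
# carries the factor 2^(r-1) / 2^(r(r-1)/2) = 2^(-(r-1)(r-2)/2).
for r in (3, 4, 5):
    assert count_frustrated_signings(r) == 2 ** (r - 1)
```

Multiplying this fraction by the probability p^{r(r-1)/2} that all edges are present recovers eq.3.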
Fig.11 Concentration of frustrated cliques of size r=3,4,5,6 in an unbiased random network, from eq.3, plotted against connectance p on a log scale. Concentrations decrease rapidly as the network is diluted, especially for large cliques. \n\nWhen the interconnections in a network are specified by a learning rule rather than at random, the expected numbers of any configuration will differ from the above results. We have compared the number of frustrated triangles in large three-valued (+1,0,-1) Hebb interconnection matrices (N=100,300,600) to the expected number in a random matrix of the same size and connectance. The Hebb matrix was constructed according to the rule: \n\nT_ij = Z_k( sum_{a=1..m} xi_i^a xi_j^a );  T_ii = 0   (4a) \n\nZ_k(x) = +1 for x > k; 0 for -k \u2264 x \u2264 k; -1 for x < -k   (4b) \n\nwhere m is the number of memories, Z_k is a threshold function with cutoff k, and xi^a is a random string of 1's and -1's. The matrix constructed by eq.4 is roughly unbiased (equal number of positive and negative connections) and has a connectance p(k). Fig.12 shows the ratio of frustrated triangles in a diluted Hebb matrix to the expected number in a random graph with the same connectance for different numbers of memories stored in the Hebb matrix. At all values of connectance, the Hebb matrix has fewer frustrated triangles than the random matrix by a ratio that is decreased by diluting the matrix or storing fewer memories. The curves do not seem to depend on the size of the matrix, N. This result suggests that diluting a Hebb matrix breaks up frustration even more efficiently than diluting a random matrix. 
Fig.12 The number of frustrated triangles in a (+,0,-) Hebb rule matrix (300x300) divided by the expected number in a random graph with equal signed connectance, plotted against connectance. The different sets of points are for different numbers of random memories in the Hebb matrix (m = 15, 25, 40, 55, 100). The lines are guides to the eye. \n\nThe sensitive dependence of frustration on connectance suggests that oscillatory modes in a large neural network with delay can be eliminated by diluting the interconnection matrix. As an example, consider an unbiased random network with delay = RC/10. From fig.9, only frustrated cliques of size r=5 or larger have oscillatory basins for this value of delay; frustration in smaller configurations cannot lead to sustained oscillation in the network. Diluting the connectance to 60% will reduce the concentration of frustrated cliques with r=5 by a factor of over 100 and r=6 by a factor of 2000. The reduction would be even greater for a clipped Hebb matrix. \n\nResults from spin glass theory 21 suggest that diluting a clipped Hebb matrix can actually improve the storage capacity for moderate dilution, with a maximum in the capacity at a connectance of 61%. To the extent this treatment applies to an analog continuous-time network, we should expect that by diluting connections, oscillatory modes can be killed before memory capacity is compromised. \n\nWe have confirmed the stabilizing effect of dilution in our network: for a fully connected eight neuron network programmed with three orthogonal memories according to eq.1, adding a delay of 0.4RC opens large basins for sustained oscillation. By randomly diluting the interconnections to p \u2248 0.85, we were able to close the oscillatory basins and recover a useful associative memory. \n\nSUMMARY \n\nWe have investigated the structure of fixed point and oscillatory basins of attraction in an electronic network of eight non-linear amplifiers with controllable time delay and a three value (+,0,-) interconnection matrix. \n\nFor fixed point attractors, we find that the network performs well as an associative memory - no spurious attractors were seen for up to four stored memories - but for three or more memories, the shapes of the basins of attraction became irregular. \n\nA network which is stable with no delay can have basins for oscillatory attractors when time delay is present. For symmetric networks with time delay, we only observe sustained oscillation when there is frustration. Frustrated cliques (fully connected configurations with all triangles frustrated), and not loops, are most prone to oscillation, and the larger the frustrated clique, the more easily it oscillates. The number of these \"dangerous\" configurations in a large network can be greatly reduced by diluting the connections. We have demonstrated that a network with a large basin for an oscillatory attractor can be stabilized by dilution. \n\nACKNOWLEDGEMENTS \n\nWe thank K.L. Babcock, S.W. Teitsworth, S. Strogatz and P. Horowitz for useful discussions. One of us (C.M.M.) acknowledges support as an AT&T Bell Laboratories Scholar. This work was supported by JSEP contract no. N00014-84-K-0465. \n\nREFERENCES \n\n1) J.S. Denker, Physica 22D, 216 (1986). \n2) J.J. Hopfield, Proc.Nat.Acad.Sci. 79, 2554 (1982). \n3) J.J. Hopfield, Proc.Nat.Acad.Sci. 81, 3088 (1984). \n4) J.S. Denker, Ed., Neural Networks for Computing, AIP Conf. Proc. 151 (1986). \n5) M.A. Cohen, S. Grossberg, IEEE Trans. SMC-13, 815 (1983). \n6) M.W. Hirsch, Convergence in Neural Nets, IEEE Conf. on Neural Networks, 1987. \n7) K.L. Babcock, R.M. Westervelt, Physica 23D, 464 (1986). 
\n8) K.L. Babcock, R.M. Westervelt, Physica 28D, 305 (1987). \n9) See, for example: D.B. Schwartz et al., Appl.Phys.Lett. 50 (16), 1110 (1987); or M.A. Silviotti et al., in Ref.4, pg.408. \n10) J.D. Keeler, in Ref.4, pg.259. \n11) CCD analog delay: EG&G Reticon RD5106A. \n12) D.O. Hebb, The Organization of Behavior (J. Wiley, N.Y., 1949). \n13) Delay in VLSI discussed in: A. Mukherjee, Introduction to nMOS and CMOS VLSI System Design (Prentice Hall, N.J., 1985). \n14) U. an der Heiden, J.Math.Biology 8, 345 (1979). \n15) M.C. Mackey, U. an der Heiden, J.Math.Biology 19, 221 (1984). \n16) Theory of spin glasses reviewed in: K. Binder, A.P. Young, Rev.Mod.Phys. 58 (4), 801 (1986). \n17) E. Fradkin, B.A. Huberman, S.H. Shenker, Phys.Rev. B18 (9), 4789 (1978). \n18) D.J. Amit, H. Gutfreund, H. Sompolinsky, Ann.Phys. 173, 30 (1987), and references therein. \n19) J.L. van Hemmen, I. Morgenstern, Editors, Heidelberg Colloquium on Glassy Dynamics, Lecture Notes in Physics 275 (Springer-Verlag, Heidelberg, 1987). \n20) P. Erdos, A. Renyi, Pub.Math.Inst.Hung.Acad.Sci. 5, 17 (1960). \n21) I. Morgenstern, in Ref.19, pg.399; H. Sompolinsky, in Ref.19, pg.485. \n22) J. Guckenheimer, P. Holmes, Nonlinear Oscillations, Dynamical Systems and Bifurcations of Vector Fields (Springer, N.Y., 1983). \n", "award": [], "sourceid": 29, "authors": [{"given_name": "Charles", "family_name": "Marcus", "institution": null}, {"given_name": "R.", "family_name": "Westervelt", "institution": null}]}