Draft:Receptron

From Wikipedia, the free encyclopedia


The receptron (short for "reservoir perceptron") is a neuromorphic data processing model that generalizes the traditional perceptron by incorporating non-linear interactions between inputs.[1][2][3][4][5] Unlike the classical perceptron, which relies on a sum of linearly independent weights, the receptron leverages the complexity of physical substrates, such as the electrical conduction properties of nanostructured materials or optical speckle fields, to perform classification tasks.[6] The receptron bridges unconventional computing and principles of biological neural networks,[7] enabling solutions that do not require the training procedures typical of artificial neural networks based on the perceptron model.[8]

Algorithm


The receptron is an algorithm for supervised learning of binary classifiers, i.e. a classification algorithm that makes its predictions based on a predictor function combining a set of weights with the feature vector. The mathematical model is based on the sum of inputs with non-linear interactions:

    S = \sum_{i=1}^{N} w_i(x_1, \ldots, x_N) \, x_i     (1)

where the w_i(x_1, \ldots, x_N) are non-linear weight functions depending on the inputs x_1, \ldots, x_N. This nonlinearity makes the system considerably more complex and allows it to solve problems that cannot be solved by the simpler rules of a linear system such as the perceptron or the McCulloch–Pitts neuron, which is based on a sum with linearly independent weights:

    S = \sum_{i=1}^{N} w_i \, x_i     (2)

where the w_i are constant real values. A consequence of this simplicity is the limitation to linearly separable functions, which necessitates multi-layer architectures and training algorithms such as backpropagation.[9]
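This limitation can be illustrated with a short sketch of the linear rule of Eq. 2 (the function names and the exhaustive grid search are illustrative, not part of any published implementation):

```python
import itertools

def perceptron(x, w, th):
    """Classical perceptron (Eq. 2 plus thresholding): linear weighted sum,
    then a constant-threshold step."""
    s = sum(wi * xi for wi, xi in zip(w, x))
    return 1 if s > th else 0

# A linear unit separates AND easily ...
assert all(perceptron((x1, x2), w=(1, 1), th=1.5) == (x1 and x2)
           for x1, x2 in itertools.product((0, 1), repeat=2))

# ... but no choice of (w1, w2, th) reproduces XOR: a grid search over
# weights and thresholds in [-2, 2] finds no solution.
grid = [i / 2 for i in range(-4, 5)]
xor_solvable = any(
    all(perceptron((x1, x2), (w1, w2), th) == (x1 ^ x2)
        for x1, x2 in itertools.product((0, 1), repeat=2))
    for w1 in grid for w2 in grid for th in grid)
print(xor_solvable)  # False: XOR is not linearly separable
```

The search fails for any grid, since XOR is provably not linearly separable; the finite grid here simply makes the failure observable in a few lines.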

As in the perceptron case, the summation in Eq. 1 determines the activation of the receptron output through a thresholding process,

    y = \begin{cases} 1 & \text{if } S > \mathrm{th} \\ 0 & \text{otherwise} \end{cases}     (3)

where th is a constant threshold parameter. Equation 3 can equivalently be written using the Heaviside step function, y = \theta(S - \mathrm{th}).

The weight functions w_i(x_1, \ldots, x_N) can be written with a finite number of parameters, simplifying the model representation. One can Taylor-expand w_i and use the idempotency of Boolean variables (x_i^2 = x_i) so that S can be written as

    S = \sum_{k=1}^{N} \sum_{i_1 < i_2 < \cdots < i_k} W_{i_1 i_2 \cdots i_k} \, x_{i_1} x_{i_2} \cdots x_{i_k}     (4)

where the W_{i_1 \cdots i_k} are independent parameters that can be seen as the components of a tensor (a "weight tensor").
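The expansion in Eq. 4 can be sketched as follows, with the weight tensor stored as a dictionary keyed by sorted index tuples (a hypothetical layout chosen for illustration, not the representation used in any published implementation):

```python
from itertools import combinations

def receptron_sum(x, W):
    """Eq. 4: S as a sum over all subsets of the Boolean inputs, each
    weighted by the corresponding component of the weight tensor.
    W maps sorted index tuples (i1, ..., ik) to real coefficients."""
    n = len(x)
    s = 0.0
    for k in range(1, n + 1):
        for idx in combinations(range(n), k):
            term = W.get(idx, 0.0)
            for i in idx:
                term *= x[i]  # product x_{i1} * ... * x_{ik}
            s += term
    return s

# Diagonal (single-index) terms alone reproduce a perceptron;
# the pair term (0, 1) adds the non-linear interaction.
W = {(0,): 1.0, (1,): 1.0, (0, 1): -2.0}
print(receptron_sum((1, 1), W))  # 1.0 + 1.0 - 2.0 = 0.0
```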

The sum in Eq. 4 reduces to the perceptron case when the off-diagonal terms of the weight tensor vanish. Considering, for example, the two-input case N = 2, one gets:

    S = W_1 x_1 + W_2 x_2 + W_{12} \, x_1 x_2     (5)

In the perceptron case, the vanishing of the cross term W_{12} implies linearity, i.e. S(x_1, x_2) = S(x_1, 0) + S(0, x_2). In the receptron case W_{12} \neq 0, meaning that the superposition principle is no longer valid, the cross terms being responsible for the more complex non-linear interaction between the inputs.
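A concrete consequence: with a non-zero cross term, the two-input receptron of Eq. 5 can realize XOR, which no single perceptron can. A minimal sketch, with illustrative parameter values (not taken from any published device):

```python
def receptron2(x1, x2, W1=1.0, W2=1.0, W12=-2.0, th=0.5):
    """Two-input receptron (Eqs. 3 and 5): the cross term W12*x1*x2
    breaks the superposition principle. Parameter values are illustrative."""
    s = W1 * x1 + W2 * x2 + W12 * x1 * x2
    return 1 if s > th else 0  # thresholding as in Eq. 3

# With W12 = 0 the unit reduces to a perceptron; with W12 = -2 it computes XOR:
for x1 in (0, 1):
    for x2 in (0, 1):
        assert receptron2(x1, x2) == (x1 ^ x2)
```

The cross term pulls S back below the threshold exactly when both inputs are active, which is what the linear sum of Eq. 2 can never do.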

Design and Implementations


1. Electrical Receptron


Substrate: Nanostructured and nanocomposite films (Au, Pt, Zr, Au/Zr). These films form disordered networks of nanoparticles with resistive switching and non-linear electrical conduction.

2. Optical Receptron


Substrate: Optical speckle fields generated by random interference of light emerging from a disordered medium illuminated by a laser or coherent radiation.

Key Features


Physical Substrate Computing: The receptron does not require digital training; instead, it exploits the natural complexity of materials (e.g., nanowire networks, diffractive media) to perform computations.

Non-Linear Separability: Unlike traditional perceptrons, which fail on problems like the XOR function, the receptron can solve such tasks due to its inherent non-linearity.

Training-Free Operation: Classification is achieved through the physical system's response rather than iterative weight adjustments, reducing computational overhead.
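The training-free mode of operation can be caricatured in software: instead of adjusting weights by gradient descent, one samples candidate configurations and keeps one whose thresholded response matches the target function. In the sketch below a random weight tensor stands in for an uncharacterized physical substrate; this is an assumption made for illustration, not a model of a real device.

```python
import random
from itertools import combinations, product

def substrate_response(x, coeffs):
    """Random multilinear response standing in for a physical substrate
    (an assumption of this sketch, not a device model)."""
    s = 0.0
    for idx, c in coeffs.items():
        if all(x[i] for i in idx):
            s += c
    return s

def find_receptron(target, n=2, tries=5000, seed=0):
    """Training-free search: sample random responses and thresholds and
    keep the first configuration whose output matches `target` on all inputs."""
    rng = random.Random(seed)
    inputs = list(product((0, 1), repeat=n))
    subsets = [c for k in range(1, n + 1) for c in combinations(range(n), k)]
    for _ in range(tries):
        coeffs = {idx: rng.uniform(-1.0, 1.0) for idx in subsets}
        outs = [substrate_response(x, coeffs) for x in inputs]
        th = rng.uniform(min(outs), max(outs))
        if all((o > th) == bool(target(*x)) for o, x in zip(outs, inputs)):
            return coeffs, th  # selected, not trained
    return None

# A configuration realizing XOR is *selected* from random samples:
solution = find_receptron(lambda a, b: a ^ b)
```

The point of the sketch is the contrast with backpropagation: no weight is ever updated; the complexity of the (here simulated) substrate response is simply searched for a configuration that already solves the task.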

References

  1. ^ Mirigliano, Matteo; Paroli, Bruno; Martini, Gianluca; Fedrizzi, Marco; Falqui, Andrea; Casu, Alberto; Milani, Paolo (2021-12-01). "A binary classifier based on a reconfigurable dense network of metallic nanojunctions". Neuromorphic Computing and Engineering. 1 (2): 024007. doi:10.1088/2634-4386/ac29c9. ISSN 2634-4386.
  2. ^ Paroli, B.; Martini, G.; Potenza, M. A. C.; Siano, M.; Mirigliano, M.; Milani, P. (2023-09-01). "Solving classification tasks by a receptron based on nonlinear optical speckle fields". Neural Networks. 166: 634–644. doi:10.1016/j.neunet.2023.08.001. ISSN 0893-6080. PMID 37604074. Archived from the original on 2024-04-18. Retrieved 2025-09-03.
  3. ^ Paroli, B.; Borghi, F.; Potenza, M. A. C.; Milani, P. (2025-06-24), The receptron is a nonlinear threshold logic gate with intrinsic multi-dimensional selective capabilities for analog inputs, arXiv:2506.19642
  4. ^ Iyer, Prasad P.; Bhatt, Gaurang R.; Desai, Saaketh; Fuller, Elliot J.; Teeter, Corinne M.; Léonard, François; Vineyard, Craig M. (2025-08-08). "Is Computing with Light All You Need? A Perspective on Codesign for Optical Artificial Intelligence and Scientific Computing". Advanced Intelligent Systems: 2500371. doi:10.1002/aisy.202500371. ISSN 2640-4567.
  5. ^ Perez, Jake C.; Shaheen, Sean E. (August 2020). "Neuromorphic-based Boolean and reversible logic circuits from organic electrochemical transistors". MRS Bulletin. 45 (8): 649–654. doi:10.1557/mrs.2020.202. ISSN 0883-7694.
  6. ^ Stieg, Adam Z.; Avizienis, Audrius V.; Sillin, Henry O.; Martin-Olmos, Cristina; Aono, Masakazu; Gimzewski, James K. (2012-01-10). "Emergent Criticality in Complex Turing B-Type Atomic Switch Networks". Advanced Materials. 24 (2): 286–293. Bibcode:2012AdM....24..286S. doi:10.1002/adma.201103053. ISSN 0935-9648.
  7. ^ Frenkel, Charlotte; Bol, David; Indiveri, Giacomo (June 2023). "Bottom-Up and Top-Down Approaches for the Design of Neuromorphic Processing Systems: Tradeoffs and Synergies Between Natural and Artificial Intelligence". Proceedings of the IEEE. 111 (6): 623–652. doi:10.1109/JPROC.2023.3273520. ISSN 0018-9219.
  8. ^ Barrows, Frank; Lin, Jonathan; Caravelli, Francesco; Chialvo, Dante R. (July 2025). "Uncontrolled Learning: Codesign of Neuromorphic Hardware Topology for Neuromorphic Algorithms". Advanced Intelligent Systems. 7 (7): 2400739. doi:10.1002/aisy.202400739. ISSN 2640-4567.
  9. ^ Goh, A.T.C. (January 1995). "Back-propagation neural networks for modeling complex systems". Artificial Intelligence in Engineering. 9 (3): 143–151. doi:10.1016/0954-1810(94)00011-S.