UCLA SFINX: Neural Network Simulation Environment
Author: E. Paik
Total Pages: 43
Release: 1989
Author: Eugene Sam Paik
Total Pages: 43
Release: 1989
Genre: Computer vision
Author: Eugene Paik
Total Pages: 10
Release: 1987
Massively parallel computing architectures are of widespread interest because they can significantly reduce the execution time of some computationally intensive algorithms. There are tasks, such as the guidance of an autonomous robot over unknown terrain, where a system's survival depends on real-time interaction with its environment. These time constraints force algorithms to be recast in a form that more closely matches, and thereby takes advantage of, the underlying computing architecture. Similarly, neurophysiology has shown that natural systems derive the real-time functionality they need from massively parallel networks by organizing structural components around functional goals. SFINX (Structure and Function In Neural connections) is a neural network simulation environment that allows researchers to investigate the behavior of various neural structures. It is designed to easily express and simulate the highly regular connection patterns often found in large networks, yet it is also general enough to model parallel systems of arbitrary interconnectivity. This paper compares SFINX to previous neural network simulators and describes its features and overall organization.
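The abstract's central design point, that a simulator can offer convenience for regular connection patterns while still permitting arbitrary interconnectivity, can be illustrated with a minimal sketch. This is not SFINX's actual interface (the paper does not specify its API); all class and method names here are hypothetical, and a simple synchronous sigmoid update stands in for whatever update rule a real simulator would provide:

```python
import math

class Network:
    """Toy network simulator: neurons are ids, connectivity is an edge list."""

    def __init__(self):
        self.potentials = {}  # neuron id -> current activation
        self.edges = {}       # neuron id -> list of (source id, weight)

    def add_neuron(self, nid, value=0.0):
        self.potentials[nid] = value
        self.edges.setdefault(nid, [])

    def connect(self, src, dst, weight):
        # Arbitrary interconnectivity: any neuron may feed any other.
        self.edges[dst].append((src, weight))

    def connect_layers(self, layer_a, layer_b, weight):
        # Regular pattern expressed in one call: fully connect two layers.
        for dst in layer_b:
            for src in layer_a:
                self.connect(src, dst, weight)

    def step(self):
        # Synchronous update: each neuron with inputs applies a sigmoid
        # to its weighted input sum; input neurons hold their values.
        new = {}
        for nid, inputs in self.edges.items():
            if not inputs:
                new[nid] = self.potentials[nid]
                continue
            s = sum(self.potentials[src] * w for src, w in inputs)
            new[nid] = 1.0 / (1.0 + math.exp(-s))
        self.potentials = new

net = Network()
for nid in ["i0", "i1", "o0"]:
    net.add_neuron(nid)
net.potentials["i0"] = 1.0
net.potentials["i1"] = 1.0
net.connect_layers(["i0", "i1"], ["o0"], weight=0.5)
net.step()
print(round(net.potentials["o0"], 3))  # sigmoid(0.5 + 0.5) -> 0.731
```

The point of the sketch is that `connect_layers` is mere convenience over `connect`: regular structure compiles down to the same edge list that irregular structure uses, which is one way to reconcile the two goals the abstract names.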