Computational Laboratory for Energy And Nanoscience


Manuscript Summary - Extensive deep neural networks

In this article, we demonstrate a new type of deep neural network designed to divide a problem into many parts so that it can be worked on efficiently in parallel. Traditional neural networks have been optimized for data that is scale invariant. Our new extensive deep neural networks (EDNN) maintain the property of additivity across sub-domains, an important property of physical systems. EDNN match the accuracy of previously reported work, but can be distributed over many GPUs in parallel, making them ideal for massively parallel computing architectures. Additionally, EDNN operate on inputs of arbitrary size, so they can be trained on small structures and then used to predict the properties of giant ones. Multi-level EDNN can be used in multi-scale modelling, where the degree of coarseness is interaction specific.
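The decomposition idea can be illustrated with a minimal sketch: the input is split into tiles, each tile (plus some surrounding context) is passed through one shared per-tile model, and the per-tile contributions are summed, giving an extensive (additive) total that works for any input size. The function names (`tile_network`, `ednn_predict`) and the specific tile/context sizes below are illustrative assumptions, not the manuscript's actual implementation; the toy per-tile model is a local sum so that additivity is exact.

```python
import numpy as np

def tile_network(patch, focus):
    # Hypothetical per-tile model: maps a (focus + 2*context) patch to a
    # scalar contribution. Only the central `focus` region contributes
    # here; in a real EDNN this would be a trained neural network that
    # also sees the surrounding context.
    context = (len(patch) - focus) // 2
    return patch[context:context + focus].sum()

def ednn_predict(x, focus=4, context=2):
    """Split x into non-overlapping focus tiles, pad each with periodic
    context, evaluate the shared tile network on every tile, and sum
    the per-tile contributions (extensivity)."""
    n = len(x)
    assert n % focus == 0, "input length must be a multiple of the tile size"
    total = 0.0
    for start in range(0, n, focus):
        # Periodic boundary: indices wrap around the domain.
        idx = np.arange(start - context, start + focus + context) % n
        total += tile_network(x[idx], focus)
    return total

x_small = np.random.rand(8)    # train-scale input...
x_large = np.random.rand(800)  # ...and a much larger one, same model
print(np.isclose(ednn_predict(x_small), x_small.sum()))  # True
print(np.isclose(ednn_predict(x_large), x_large.sum()))  # True
```

Because each tile is evaluated independently, the loop body can be dispatched to separate GPUs, which is the source of the parallel scaling described above.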

UOIT