Matches in SemOpenAlex for { <https://semopenalex.org/work/W2313580145> ?p ?o ?g. }
Showing items 1 to 43 of 43, with 100 items per page.
- W2313580145 abstract "This paper describes the use of artificial neural networks (ANNs) to model cooling energy use in commercial buildings, and compares the attributes of ANNs and least-squares (LS) regression modeling techniques. The neuro-biological roots of ANN models and the fundamentals of the backpropagation algorithm are described. The effects of differing values of model parameters (gain, bias and learning rate) and network architectures (three and five layer networks) on the rate of convergence and prediction accuracy of ANN models are discussed. Finally, the attributes of ANN and least-squares regression models are compared in a case-study example using measured energy use data from a large commercial building. The results draw attention to the importance of parameter selection when using ANN models, and indicate that multiple hidden layers in ANNs appear to be necessary when modeling the non-linear energy use typical of commercial buildings. Introduction The ability to accurately predict the behavior of energy-using systems in commercial buildings is increasingly valuable. Predicted energy use can be compared to observed energy use in order to identify operational problems and measure the effectiveness of energy conservation retrofits [1]. Energy use forecasts can also be incorporated into control procedures which enhance comfort conditions and reduce energy and demand expenses [2,3]. Although engineering models can estimate building energy use, difficulties inherent in the calibration procedure often limit the accuracy of the resulting predictions. Empirical models, such as LS regression models and ANNs, can increase the accuracy and reduce the modeling time required for energy use forecasts. This paper describes the use of ANNs to model cooling energy use in a commercial building and compares the attributes of ANN and LS models. The Neurobiological Model and ANNs The human brain is a highly complex organ comprised of some 10^11 basic units called neurons. 
Each neuron is connected to about 10^4 other neurons. Because of this highly interconnected nature, the architecture of the brain is referred to as being massively parallel or massively interconnected. Each neuron consists of a soma, dendrites, axons and synapses (Figure 1). The soma is the body of the neuron. Dendrites and axons extend from the soma and branch out like roots. If a neuron receives enough active inputs along its dendrites, it fires and sends a voltage spike down the axons. Axons are connected to other dendrites and somas at synapses. When a neuron fires, chemicals called neurotransmitters are diffused across the synapses. Learning is thought to occur at the synapses, where the neurotransmitters and neuroreceivers vary to reinforce good connections and discourage bad connections [4]. (Figure 1: Schematic representation of a neuron.) ANNs attempt to mimic parts of the architecture and functionality of the brain. Neurons are simulated in ANNs as connected nodes. The distributed, parallel processing structure of the brain is simulated by arranging the nodes in layers such that each node is connected to all of the nodes in the adjacent layers. In a manner analogous to the response of a neuron, each node sums the inputs it receives and transmits an output signal to the other nodes to which it is connected. The output signal of each ANN node is multiplied by a weight which is varied during the learning process, just as synaptic neurotransmitters and receivers are varied in the human learning process. The distributed, parallel architecture of the brain is well suited to learning and pattern recognition tasks such as vision, in which several processes and comparisons are made simultaneously. (Copyright © 1994 by John K. Kissock. Published by the American Institute of Aeronautics and Astronautics, Inc. with permission.) 
In contrast, the digital computer is a serial device in which a single central processing unit (CPU) sequentially processes a set of instructions. ANN algorithms simulate the brain's parallel architecture in the serial environment of the digital computer, with the hope of mimicking parts of the brain's amazing capacity for learning and pattern recognition. Generalized Delta, Back-Propagation Algorithm Many different types of ANNs have been devised to accomplish a wide variety of tasks including recognition of handwritten English words, speech recognition and image compression [5]. The ANNs examined here employ a fully-connected, feedforward architecture (Figure 2). Fully-connected means that each node is connected to all of the nodes in the adjacent layers (or columns of nodes). Feedforward indicates that information is passed in a single direction from the input to the output nodes." @default.
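The abstract above describes a fully-connected, feedforward network trained with the generalized delta (backpropagation) rule, in which each node sums its weighted inputs and every weight is adjusted in proportion to a learning rate. A minimal sketch of that scheme follows; the sigmoid activation, layer sizes, learning-rate values, and the XOR demonstration problem are illustrative assumptions, not the configuration used in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class ThreeLayerNet:
    """Input layer -> one hidden layer -> output layer, fully connected."""

    def __init__(self, n_in, n_hidden, n_out, lr=0.5):
        self.W1 = rng.normal(0.0, 0.5, (n_in, n_hidden))
        self.b1 = np.zeros(n_hidden)
        self.W2 = rng.normal(0.0, 0.5, (n_hidden, n_out))
        self.b2 = np.zeros(n_out)
        self.lr = lr  # learning rate, one of the parameters the paper studies

    def forward(self, x):
        # Each node sums its weighted inputs and applies the activation,
        # analogous to a neuron summing signals along its dendrites.
        self.h = sigmoid(x @ self.W1 + self.b1)
        self.y = sigmoid(self.h @ self.W2 + self.b2)
        return self.y

    def backward(self, x, target):
        # Generalized delta rule: propagate the output error backward
        # and adjust every weight in proportion to the learning rate.
        y = self.forward(x)
        delta_out = (y - target) * y * (1.0 - y)
        delta_hid = (delta_out @ self.W2.T) * self.h * (1.0 - self.h)
        self.W2 -= self.lr * np.outer(self.h, delta_out)
        self.b2 -= self.lr * delta_out
        self.W1 -= self.lr * np.outer(x, delta_hid)
        self.b1 -= self.lr * delta_hid
        return 0.5 * float(np.sum((y - target) ** 2))

# XOR: a simple non-linear mapping that a network with no hidden layer
# cannot represent, echoing the paper's point that hidden layers are
# needed to model non-linear behavior.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
T = np.array([[0], [1], [1], [0]], dtype=float)

net = ThreeLayerNet(2, 4, 1, lr=1.0)
for epoch in range(5000):
    for x, t in zip(X, T):
        net.backward(x, t)

preds = np.array([net.forward(x)[0] for x in X])
print("XOR predictions:", np.round(preds, 2))
```

A three-layer network in this sense has one hidden layer; the five-layer networks the abstract compares would simply add further hidden layers between input and output, with the same delta rule propagated through each.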
- W2313580145 created "2016-06-24" @default.
- W2313580145 creator A5082954596 @default.
- W2313580145 date "1994-08-07" @default.
- W2313580145 modified "2023-09-25" @default.
- W2313580145 title "Modeling commercial building energy use with artificial neural networks" @default.
- W2313580145 cites W1990017717 @default.
- W2313580145 cites W4210937630 @default.
- W2313580145 cites W4256165609 @default.
- W2313580145 doi "https://doi.org/10.2514/6.1994-4161" @default.
- W2313580145 hasPublicationYear "1994" @default.
- W2313580145 type Work @default.
- W2313580145 sameAs 2313580145 @default.
- W2313580145 citedByCount "6" @default.
- W2313580145 countsByYear W23135801452013 @default.
- W2313580145 countsByYear W23135801452014 @default.
- W2313580145 countsByYear W23135801452016 @default.
- W2313580145 countsByYear W23135801452023 @default.
- W2313580145 crossrefType "proceedings-article" @default.
- W2313580145 hasAuthorship W2313580145A5082954596 @default.
- W2313580145 hasConcept C154945302 @default.
- W2313580145 hasConcept C41008148 @default.
- W2313580145 hasConcept C50644808 @default.
- W2313580145 hasConceptScore W2313580145C154945302 @default.
- W2313580145 hasConceptScore W2313580145C41008148 @default.
- W2313580145 hasConceptScore W2313580145C50644808 @default.
- W2313580145 hasLocation W23135801451 @default.
- W2313580145 hasOpenAccess W2313580145 @default.
- W2313580145 hasPrimaryLocation W23135801451 @default.
- W2313580145 hasRelatedWork W2093578348 @default.
- W2313580145 hasRelatedWork W2159443810 @default.
- W2313580145 hasRelatedWork W2358668433 @default.
- W2313580145 hasRelatedWork W2376932109 @default.
- W2313580145 hasRelatedWork W2386387936 @default.
- W2313580145 hasRelatedWork W2390279801 @default.
- W2313580145 hasRelatedWork W2748952813 @default.
- W2313580145 hasRelatedWork W2899084033 @default.
- W2313580145 hasRelatedWork W644753246 @default.
- W2313580145 hasRelatedWork W1629725936 @default.
- W2313580145 isParatext "false" @default.
- W2313580145 isRetracted "false" @default.
- W2313580145 magId "2313580145" @default.
- W2313580145 workType "article" @default.