We Continue the Work of Those
Who Were First.

  • Electrotechnics
  • Electrical Engineering
  • Light & Lighting
  • Power Engineering
  • Transportation
  • Automation
  • Communication
  • Smart Buildings
  • Industry
  • Innovation

Current issue

ELEKTRO 6/2019 was released on June 6th, 2019. Its digital version will be available on June 24th, 2019.

Topic: Rotating electrical machines, drives and power electronics; Electromobility

Main Article
Hybrid drive of a shunting locomotive

SVĚTLO (Light) 3/2019 was released on June 11th, 2019. Its digital version will be available on July 17th, 2019.

Fairs and exhibitions
Euroluce 2019 through designers' eyes
The Light in Architecture 2019 exhibition
Amper 2019 in the grip of sophisticated technologies

Refreshing our memory
Lighting glass from Kamenný pahorek

Researchers show glare of energy consumption in the name of deep learning

10.06.2019 | Tech Xplore | www.techxplore.com

Wait, what? Creating an AI can be way worse for the planet than a car? Think carbon footprint. That is what a group at the University of Massachusetts Amherst did: they set out to assess the energy consumption needed to train four large neural networks.

Deep learning involves processing very large amounts of data, and in order to learn something as complex as language, the models have to be large. What is the price of pushing such models to higher accuracy? The price is roping in exceptionally large computational resources, which in turn causes substantial energy consumption.


The researchers reported that "the process can emit more than 626,000 pounds of carbon dioxide equivalent—nearly five times the lifetime emissions of the average American car (and that includes manufacture of the car itself)."
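A quick check of that comparison, using nothing but the figures quoted above (no additional numbers from the paper are assumed):

# Sanity check of the quoted comparison, using only the numbers in the quote.
training_emissions_lbs = 626_000      # reported CO2 equivalent for the training process
stated_multiple = 5                   # "nearly five times" an average car's lifetime emissions

implied_car_lifetime_lbs = training_emissions_lbs / stated_multiple
print(f"Implied car lifetime emissions: about {implied_car_lifetime_lbs:,.0f} lbs CO2e")
# -> about 125,200 lbs, consistent with the "nearly five times" wording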

These models are costly to train and develop: costly in the financial sense, due to the cost of hardware and electricity or cloud compute time, and costly in the environmental sense, due to their carbon footprint. The paper sought to bring this issue to the attention of NLP researchers "by quantifying the approximate financial and environmental costs of training a variety of recently successful neural network models for NLP."
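To illustrate what such an accounting involves, here is a minimal back-of-the-envelope sketch: hardware power draw and training time give the energy, a data-centre overhead factor and a grid emissions factor give the carbon footprint, and an electricity price gives one part of the financial cost. Every input value below is an illustrative assumption, not a figure from the UMass Amherst study.

# Back-of-the-envelope estimate of training energy, emissions and electricity cost.
# All inputs are illustrative assumptions, not figures from the study discussed above.

def training_footprint(avg_power_kw, hours, pue, lbs_co2_per_kwh, usd_per_kwh):
    """Return (energy in kWh, emissions in lbs CO2e, electricity cost in USD)."""
    energy_kwh = avg_power_kw * hours * pue        # PUE covers data-centre overhead
    emissions_lbs = energy_kwh * lbs_co2_per_kwh   # grid carbon intensity
    cost_usd = energy_kwh * usd_per_kwh            # electricity only, no hardware or cloud markup
    return energy_kwh, emissions_lbs, cost_usd

# Hypothetical run: 8 accelerators drawing about 0.3 kW each for two weeks.
energy, emissions, cost = training_footprint(
    avg_power_kw=8 * 0.3,     # assumed total accelerator power draw
    hours=14 * 24,            # assumed training duration
    pue=1.5,                  # assumed power usage effectiveness of the data centre
    lbs_co2_per_kwh=0.95,     # assumed average grid emissions factor
    usd_per_kwh=0.12,         # assumed electricity price
)
print(f"{energy:,.0f} kWh, {emissions:,.0f} lbs CO2e, about ${cost:,.0f} in electricity")

Whatever the exact inputs, estimates of this kind share the same structure: energy use, multiplied by an overhead factor and an emissions factor, plus a price per kilowatt-hour for the financial side.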

Read more at Tech Xplore

Image Credit: Pixabay

-jk-