The Age of Artificial Intelligence, also known as the Age of Intelligence, the AI Era,[1][2][3][4] or the Cognitive Age,[5][6] is a historical period characterized by the rapid development and widespread integration of artificial intelligence (AI) technologies across many aspects of society, the economy, and daily life. It marks the transition from the Information Age to a new era in which computer systems can learn from data and make intelligent decisions in pursuit of defined goals.[7][8]
This era is marked by significant advances in machine learning and data processing, and by the application of AI to solving complex problems and automating tasks previously thought to require human intelligence.[7][11]
British neuroscientist Karl Friston's work on the free energy principle is widely seen as foundational to the Age of Artificial Intelligence, providing a theoretical framework for developing AI systems that closely mimic biological intelligence.[12] The concept has gained traction in fields ranging from neuroscience to technology.[13] Many specialists date the beginning of the era to the early 2010s, coinciding with major breakthroughs in deep learning and the increasing availability of big data, optical networking, and computational power.[14][15]
Artificial intelligence has seen a significant increase in global research activity, business investment, and societal integration over the last decade. Computer scientist Andrew Ng has referred to AI as the "new electricity", drawing a parallel to how electricity transformed industries in the early 20th century and suggesting that AI will have a similarly pervasive impact across all industries during the Age of Artificial Intelligence.[16]