2022 was a transformative year for technological innovation and digital transformation, and the trend will continue as the pace of development of potentially disruptive emerging technologies accelerates every year. The question arises: what lies ahead in tech for us to learn and experience in 2023?
While there are many impactful tech topics such as the Internet of Things, 5G, Space, Genomics, Synthetic Biology, Automation, Augmented Reality, and others, there are four tech areas to keep a keen watch on this coming year as they have promising and near-term capabilities to transform lives. They include: 1) artificial intelligence, 2) computing technologies, 3) robotics, and 4) materials science.
1) Artificial Intelligence (AI)
Since the HAL 9000 computer and director Stanley Kubrick provided a glimpse of a machine’s independent (although nefarious) capability to think in the epic movie 2001: A Space Odyssey, we have been eagerly awaiting the emergence of artificial intelligence. We are now on the cusp of that emergence, and AI is no longer found only in science fiction movies. The elements that compose AI, such as machine learning and natural language processing, are already a daily part of our lives. Today, AI can understand, diagnose, and solve problems, in some cases without being specifically programmed.
The focus and challenges of artificial intelligence are clear-cut: AI systems seek to replicate human traits and computational capabilities in a machine while surpassing human limitations in speed and scale. It is already happening. Artificial synapses that mimic the human brain will likely direct the next generation of computing. The components may differ; they may be analog or digital, and they may be based on transistors, chemical, biological, photonic, or possibly quantum elements.
Computers with AI have been predominantly designed for automation activities that include memory emulation, speech recognition, learning, planning, and problem solving. AI technologies can enable more efficient decision-making by prioritizing and acting on data, especially across larger networks with many users and variables. In the very near future, AI is going to change how we do business, how we plan, and how we design. You can see it now: AI is already a catalyst driving fundamental changes in industries such as customer service, marketing, banking, healthcare, business accounting, public safety, retail, education, and public transport.
Recently, a chatbot called ChatGPT has brought attention to the potential of AI and its human-like capabilities, especially when expressing itself in written analysis. DALL-E, another OpenAI application, has shown the ability to create images from basic text instructions. Both AI tools do so by mimicking human speech patterns and language and by synthesizing data. A good overview of ChatGPT can be found in the recent FORBES article by Arianna Johnson: Here’s What To Know About OpenAI’s ChatGPT—What It’s Disrupting And How To Use It (forbes.com)
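For readers who want to experiment directly, here is a minimal sketch of querying a ChatGPT model through OpenAI’s Python client. It assumes the pre-1.0 openai package; the model name, prompts, and response handling are illustrative assumptions, and the interface may change as OpenAI updates its SDK.

```python
# Minimal, illustrative sketch of calling a ChatGPT model via OpenAI's Python
# client (openai package, pre-1.0 interface). The model name and prompts are
# placeholder assumptions; set OPENAI_API_KEY in your environment before running.
import os
import openai

openai.api_key = os.environ["OPENAI_API_KEY"]

response = openai.ChatCompletion.create(
    model="gpt-3.5-turbo",  # assumed model name; check OpenAI's docs for current options
    messages=[
        {"role": "system", "content": "You are a concise technology analyst."},
        {"role": "user", "content": "Summarize three emerging tech trends for 2023."},
    ],
)

print(response["choices"][0]["message"]["content"])
```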
Last year, Google’s DeepMind AI division built systems that can predict millions of protein structures, a great benefit to science and health research. In a new breakthrough, DeepMind researchers have created an AI that can now write code about as well as the average human programmer. The notion of AI writing its own code and creating its own languages is both intriguing and potentially alarming. AI is not quite sentient, but it may be on track to be. DeepMind Builds AI That Codes as Well as the Average Human Programmer – ExtremeTech
Another very exciting area of potential breakthrough for AI is the human-computer interface, which could extend human brain capacity and memory. Science is already making great advances in brain-computer interfaces, and this work may include neuromorphic chips and brain mapping. Brain-computer interfaces are formed via emerging assistive devices with implantable sensors that record electrical signals in the brain and use those signals to drive external devices.
A brain-computer interface has even been shown to be able to read thoughts. This is done by placing an electrode array called an ECoG (electrocorticography) grid in direct contact with the brain’s surface to measure electrical activity. Via ECoG, paralyzed humans can now communicate with others by having their thoughts translated into text, according to Dr. Brian Brown (professor, Icahn School of Medicine at Mount Sinai). Can Technology Make Humans ‘Super’? – Innovation & Tech Today (innotechtoday.com)
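To make that pipeline a bit more concrete, the sketch below illustrates one early signal-processing step such systems typically rely on: band-pass filtering raw electrode voltages to isolate the frequency band a decoder would then translate into text. The sampling rate, band edges, and synthetic data are illustrative assumptions, not the actual clinical setup described above.

```python
# Illustrative sketch of one early step in a brain-computer interface pipeline:
# band-pass filtering a (synthetic) ECoG channel to isolate the high-gamma band
# that speech decoders commonly analyze. Sampling rate, band edges, and the data
# are assumptions for demonstration only.
import numpy as np
from scipy.signal import butter, filtfilt

fs = 1000.0  # assumed sampling rate in Hz
t = np.arange(0, 2.0, 1.0 / fs)

# Synthetic "recording": a 90 Hz high-gamma burst buried in slower rhythms and noise
signal = (0.2 * np.sin(2 * np.pi * 10 * t)      # alpha-range background rhythm
          + 0.5 * np.sin(2 * np.pi * 90 * t)    # high-gamma activity of interest
          + 0.3 * np.random.randn(t.size))      # measurement noise

# Band-pass filter for the (assumed) 70-150 Hz high-gamma band
low, high = 70.0, 150.0
b, a = butter(4, [low / (fs / 2), high / (fs / 2)], btype="band")
high_gamma = filtfilt(b, a, signal)

print("Filtered high-gamma power:", np.mean(high_gamma ** 2))
```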
A Frontiers in Science publication involving a collaboration of academia, institutes, and scientists summed up the promise of the human-computer interface. The authors concluded: “We can imagine the possibilities of what may come next with the human brain machine interface. A human B/CI system mediated by neural nanorobotics could empower individuals with instantaneous access to all cumulative human knowledge available in the cloud and significantly improve human learning capacities and intelligence. Further, it might transition totally immersive virtual and augmented realities to unprecedented levels, allowing for more meaningful experiences and fuller/richer expression for, and between, users. These enhancements may assist humanity to adapt emergent artificial intelligence systems as human-augmentation technologies, facilitating the mitigation of new challenges to the human species.” Frontiers | Human Brain/Cloud Interface (frontiersin.org)
And with the emergence of all technologies comes the fusion of how they might work together. Artificial intelligence is no doubt one of the primary catalysts involved in enhancing capabilities, especially in computing. For more on this topic of fusion, please see my FORBES article: The New Techno-Fusion: The Merging Of Technologies Impacting Our Future (forbes.com)
2) Cognitive Computing Technologies
The world of computing has witnessed seismic advancements since the invention of the electronic calculator in the 1960s. The past few years in information processing have been especially transformational in our hyper-connected world. Futurist Ray Kurzweil said that mankind will be able to “expand the scope of our intelligence a billion-fold” and that “the power of computing doubles, on average, every two years.” Recent breakthroughs in physics and nanotechnologies have brought us into a cognitive computing reality that we could not have imagined a decade ago.
Biological computing is the advanced science of using biological products to perform actions that would traditionally be done using components like copper wire and fiberglass. Common biological components used in these studies include amino acids and DNA. Computational functions can be performed by manipulating natural chemical reactions found in these substances. What is Biological Computing? (computerhope.com)
In the future, biocomputers may be stored in the DNA of living cells. This technology could store almost unlimited amounts of data and allow biocomputers to perform complex calculations beyond our current capabilities.
Recently, researchers at the Technion in Israel created a biological computer, constructed within a bacterial cell, that is capable of monitoring different substances in the environment. Currently, the computer identifies and reports on toxic and other materials. “We built a kind of biological computer in the living cells. In this computer, as in regular computers, circuits carry out complicated calculations,” said Barger. “Only here, these circuits are genetic, not electronic, and information is carried by proteins and not electrons.”
It was also reported that researchers at the National Institute of Standards and Technology (NIST) may have developed long-lived biological computers that could potentially persist inside cells. They used the nucleic acid RNA to build the computers. In explaining the difference between classical computing and biological computing, Samuel Schaffter, an NIST postdoctoral researcher, stated that “the difference is, instead of coding with ones and zeroes, you write strings of A, T, C and G, which are the four chemical bases that make up DNA.” Revamped Design Could Take Powerful Biological Computers From the Test Tube to the Cell | NIST
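Schaffter’s analogy, strings of A, T, C and G in place of ones and zeroes, can be illustrated with a toy encoder. The sketch below maps ordinary bytes onto the four DNA bases, two bits per base; it is a conceptual illustration of the information-density idea only, not a model of how NIST’s RNA-based circuits actually operate.

```python
# Toy illustration of encoding ordinary binary data as a string of DNA bases,
# two bits per base. A conceptual sketch of the storage idea only; it does not
# model how engineered RNA/DNA circuits actually compute.
BITS_TO_BASE = {"00": "A", "01": "C", "10": "G", "11": "T"}
BASE_TO_BITS = {base: bits for bits, base in BITS_TO_BASE.items()}

def encode_to_dna(data: bytes) -> str:
    bits = "".join(f"{byte:08b}" for byte in data)
    return "".join(BITS_TO_BASE[bits[i:i + 2]] for i in range(0, len(bits), 2))

def decode_from_dna(strand: str) -> bytes:
    bits = "".join(BASE_TO_BITS[base] for base in strand)
    return bytes(int(bits[i:i + 8], 2) for i in range(0, len(bits), 8))

message = b"2023"
strand = encode_to_dna(message)
print(strand)  # 16 bases encode the 4-byte message
assert decode_from_dna(strand) == message
```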
Photonic & Optical Computing: Photonic computing uses optical light pulses rather than electrical transistors to form logic gates for computer processing. Researchers at Aalto University have been developing light-based optical logic gates to meet the data processing and transfer demands of next-generation computing. Their new optical chirality logic gates can operate at ultrafast processing speeds, roughly a million times faster than existing technologies. Certainly, this is a computing area to watch. One Million Times Faster Than Current Technology: New Optical Computing Approach Offers Ultrafast Processing (scitechdaily.com)
Chemical computing: Another unconventional approach to computer processing is chemical computing. The ability of chemical systems to compute by acting as logic gates exists in nature. “We are already using chemical computers, because our brains and bodies employ communication via the diffusion of mediators, neuromodulators, hormones, etc.,” says computer scientist Andrew Adamatzky, director of the International Center of Unconventional Computing at the University of the West of England in Bristol. “We are chemical computers,” he summarizes. Chemical Computing, the Future of Artificial Intelligence | OpenMind (bbvaopenmind.com)
Quantum Computing: Civilization is now at the doorstep of quantum computing. Quantum computing will be able to provide unprecedented computational speed with predictive analytics to solve problems. Quantum technology, which uses the unique characteristics of sub-atomic particles to process data, will likely revolutionize everything from cybersecurity to real-time analytics. Quantum computing could be directed and augmented via artificial intelligence, operate in a 5G or 6G framework, support IoT, and catalyze materials science, biotech, genomics, and the Metaverse.
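For a small taste of how quantum programs differ from classical ones, the sketch below uses the open-source Qiskit library (an assumed toolkit choice; several comparable frameworks exist) to entangle two qubits, the kind of elementary building block behind the algorithms driving this excitement.

```python
# Minimal sketch of a quantum program: entangling two qubits into a Bell state.
# Qiskit is an assumed toolkit choice; the statevector simulation below runs on
# an ordinary laptop, no quantum hardware required.
from qiskit import QuantumCircuit
from qiskit.quantum_info import Statevector

qc = QuantumCircuit(2)
qc.h(0)       # put qubit 0 into an equal superposition of 0 and 1
qc.cx(0, 1)   # entangle qubit 1 with qubit 0

state = Statevector.from_instruction(qc)
print(state)  # equal amplitudes on |00> and |11>: the two qubits are now linked
```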
A market report sums up the promise of, and the race for supremacy in, quantum technologies: “Advances in quantum computer design, fault-tolerant algorithms and new fabrication technologies are now transforming this ‘holy grail’ technology into a realistic program poised to surpass traditional computation in some applications. With these new developments, the key question that companies are asking is not whether there will be a quantum computer, but who will build it and benefit from it.” “Quantum Computing Market & Technologies – 2018-2024” report
According to David Awschalom, Liew Family Professor in Molecular Engineering and Physics at the University of Chicago, senior scientist at Argonne National Laboratory, director of the Chicago Quantum Exchange, and director of Q-NEXT, a Department of Energy Quantum Information Science Center:
“In the next five years, we anticipate the emergence of metropolitan-scale entangled quantum networks for secure communication. These networks may also be used to create small clusters of quantum machines for advanced computing. We also believe that quantum sensors will be employed to significantly improve clocks, mapping, and intracellular sensing.” David Awschalom | Chicago Quantum Exchange
Going into 2023, we will continue to be in an era of quantum discovery. Quantum computing is still in a nascent stage, but we are certainly on the pathway to a new quantum era, and we may arrive there sooner than we imagined. In the future, it will be the combination of classical, biological, chemical, and quantum computing, paired with artificial intelligence, that shifts the computing paradigms as we currently know them.
3) Robotics
Robotics is often viewed as the face of emerging technology, especially with the growth of capabilities in humanoid machines that captivate our attention. Robots have also been used for decades to automate manufacturing, farming, warehouse functions, hospitals, security, and more, mostly for routine, programmable functions. Now aided by machine learning, machine vision, AI, and advanced sensors, robotics has become transformative in many industry verticals.
“By combining machine vision with learning capabilities, roboticists are opening a wide range of new possibilities like vision-based drones, robotic harvesting, robotic sorting in recycling, and warehouse pick and place. We’re finally at the inflection point: The moment where these applications are becoming good enough to provide real value in semi-structured environments where traditional robots could never succeed.” 2022: A major revolution in robotics | ZDNET
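As a concrete (and heavily simplified) illustration of the machine-vision step behind applications like warehouse pick and place, the sketch below uses OpenCV to find candidate objects in a single camera frame. The image path and thresholds are placeholder assumptions, and a production system would layer depth sensing, learned object detection, and grasp planning on top of this.

```python
# Simplified machine-vision sketch: finding candidate objects in a camera frame
# with OpenCV, the kind of step a pick-and-place robot builds on. The image path
# and thresholds are placeholders for illustration only.
import cv2

frame = cv2.imread("bin_camera_frame.jpg")  # placeholder path to a camera frame
if frame is None:
    raise SystemExit("Provide a sample image to run this sketch.")

gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
blurred = cv2.GaussianBlur(gray, (5, 5), 0)
edges = cv2.Canny(blurred, threshold1=50, threshold2=150)

contours, _ = cv2.findContours(edges, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
for contour in contours:
    x, y, w, h = cv2.boundingRect(contour)
    if w * h > 500:  # ignore tiny specks of noise
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
        print(f"candidate object at x={x}, y={y}, size={w}x{h}")
```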
A humanoid robotic version of the Terminator has also come within technological reach. Engineers at Cornell University have created a robot capable of detecting when and where it has been damaged and then restoring itself on the spot. Terminator-style robot can survive being STABBED | Daily Mail Online
The extent to which humans will be replaced by robot helpers or morphed into man-machines is an interesting philosophical question. Joan Slonczewski, a microbiologist at Kenyon College, notes that humans have continuously redefined intelligence and transferred those tasks to machines. Slonczewski asks: “Could we evolve ourselves out of existence, being gradually replaced by the machines?” Intelligent Robots Will Overtake Humans by 2100, Experts Say | Live Science
Robotics also has great potential for space and ocean exploration in extreme environments. In fact, NASA’s Jet Propulsion Laboratory in Southern California developed a robotic arm called the Cold Operable Lunar Deployable Arm (COLDArm) that is designed to withstand temperatures of minus 280 degrees Fahrenheit and can allow future missions to explore the moon and planets. NASA Developing Robotic Arm to Withstand Moon’s Frigid Nights (gizmodo.com) Robotics will allow mankind to explore and boldly go where no one has gone before.
4) Advanced Materials Science and 3-D Printing
Applications for advanced materials are part of the new world of discovery enabled by AI and advanced analytics. As with AI, materials science benefits from technological fusion, working in concert with other emerging technologies. New materials are now being developed that can be stronger, lighter, able to handle extreme environments, and often able to function at a higher rate of efficiency. They include, among other categories, electric materials, biological materials, composites, polymers, and nanotech.
Exciting research and development in materials science is leading to the creation of stronger, more durable, lighter, and even “self-healing” and self-assembling materials. The capability to design and build infrastructure such as bridges, roads, and buildings with stronger, adaptable, self-intelligent, and seemingly eternal materials will revolutionize the construction and transportation industries.
3-D printing: 3-D printing refers to creating a three-dimensional object layer by layer from a computer-aided design program. To print the object, the computer divides it into flat layers that are printed one by one. By printing with advanced pliable materials such as plastics, ceramics, metals, and graphene, there have already been breakthroughs in prosthetics for medicine and in wearable sensors.
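The layer-by-layer idea is easy to see in code. The toy “slicer” below takes a simple solid (a cone described analytically) and computes the cross-section a printer would deposit at each layer height; real slicers work on CAD/STL mesh files and are far more sophisticated, so treat this purely as a conceptual sketch.

```python
# Toy illustration of the slicing step behind 3-D printing: divide a solid into
# flat layers and compute what must be deposited at each height. Real slicers
# operate on CAD/STL meshes; the analytic cone here is a stand-in for simplicity.
import math

def slice_cone(base_radius_mm: float, height_mm: float, layer_height_mm: float):
    """Yield (layer_z, cross_section_radius) pairs for a right circular cone."""
    z = 0.0
    while z < height_mm:
        # The cone narrows linearly from base_radius at z=0 to zero at the tip.
        radius = base_radius_mm * (1.0 - z / height_mm)
        yield z, radius
        z += layer_height_mm

for z, r in slice_cone(base_radius_mm=20.0, height_mm=40.0, layer_height_mm=10.0):
    area = math.pi * r ** 2
    print(f"layer at z={z:4.1f} mm: radius {r:5.2f} mm, area {area:7.2f} mm^2")
```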
The big advantages of 3-D printing are that objects can be customized, produced rapidly, and made cost-effectively. The possibilities for 3-D printing are seemingly limitless, and it is already trailblazing future manufacturing. 3-D printing innovation is making its way into printing electronics, sensors, and circuits. “Printed electronics,” or electronic chips, are fabricated by printing their features on top of thin surfaces. Using semiconducting and conductive inks and materials, 3-D printers can now print transistors, sensors, circuits, batteries, and displays.
Bioprinting: One of the most important applications of 3-D printing and materials science is the potential for bio-printing. Bioprinters are like 3-D printers but instead use biomaterials, such as living cells, to create complex structures such as blood vessels or skin tissue. Researchers have bio-printed human kidneys, bladders, lungs, and other body parts in the lab. Recently, a research team based out of the Wake Forest Institute for Regenerative Medicine succeeded in creating what could be one of the biggest breakthroughs in bioprinting thus far: a 3-D printer capable of generating functional replacement tissue. Bioprinting Breakthroughs | Bioprinting World
As there is a worldwide shortage of human organs available for life-saving transplantation, bio-printing offers much hope for creating a safe and continuous pipeline for medical uses and patient healthcare.
The societal, scientific, and economic impact of these four technology areas will be harvested in the coming year and into the foreseeable future. I have provided just a glimpse of a few of their potential applications. It could be a very promising, likely disruptive, and probably wild ride in the quest to actualize the new technologies of the Fourth Industrial Era.
About The Author
Chuck Brooks, President of Brooks Consulting International, is a globally recognized thought leader and subject matter expert in Cybersecurity and Emerging Technologies. Chuck is also Adjunct Faculty at Georgetown University’s Graduate Cybersecurity Risk Management Program, where he teaches courses on risk management, homeland security technologies, and cybersecurity. He is also Vice-Chair of IEEE Cyber Security for Next Generation Connectivity Systems for Quantum IOT and serves as the Quantum Security Alliance Chair for IOT. LinkedIn named Chuck as one of “The Top 5 Tech People to Follow on LinkedIn.” He was named one of the world’s “10 Best Cyber Security and Technology Experts” by Best Rated, a “Top 50 Global Influencer in Risk, Compliance” by Thomson Reuters, “Best of The World in Security” by CISO Platform, and by IFSEC and Thinkers 360 as the “#2 Global Cybersecurity Influencer.” He was featured in the 2020, 2021, and 2022 Onalytica “Who’s Who in Cybersecurity” lists and was named one of the Top 5 Executives to Follow on Cybersecurity by Executive Mosaic. He is also a Cybersecurity Expert for “The Network” at the Washington Post, Visiting Editor at Homeland Security Today, Expert for Executive Mosaic/GovCon, and a Contributor to FORBES. He has an MA in International Relations from the University of Chicago, a BA in Political Science from DePauw University, and a Certificate in International Law from The Hague Academy of International Law.