News for the week ending June 16, 2017 includes accounts of: US government investment in AI research lagging private industry; Baidu of China holding mass translation sessions to gain an AI edge; a new generation of AI chips designed to move AI from the cloud to the device, usually a mobile phone; and outgoing GE CEO Jeff Immelt's legacy of a big bet on AI.
Private Industry Outpacing
Government Investment in AI Research
Research funded by the U.S. government has led to many innovations in science and technology and spawned many companies in private industry. Today the government appears to be ceding its leadership role in science investment to the private sector. The New York Times reported recently that Google, Amazon, Apple, Facebook and Microsoft together are on track to spend $60 billion this year on R&D. The US federal government, in comparison, spent about $67 billion on non-defense scientific research in 2015.
The federal government spent $1.1 billion on unclassified AI research in 2015, according to a White House report issued in the late stages of the Obama administration.
Some are concerned that not everyone will benefit from AI advances if industry leads the charge in investing. “As powerful AI systems come online, the intent and motivations of their creators will be major factors in determining their impact,” said Greg Brockman, a founder and chief technology officer of OpenAI, an AI research firm. “If AI development is done entirely in for-profit companies, then these systems are likely to be deployed to benefit just one organization or group of people.”
Many in the technology industry do support greater federal funding for research. Eric Schmidt, executive chairman of Alphabet, wrote an op-ed in the Washington Post arguing that federal funding of science and technology had created a “miracle machine” of public-private partnership, “one of America’s greatest advantages.”
For more information, go to the New York Times.
Mass Translation Sessions
A recent piece in Bloomberg Businessweek outlined efforts the Chinese company Baidu is making in artificial intelligence. Baidu holds mass translation sessions involving thousands of translators across China, rendering brochures, letters and technical manuals written in foreign languages into Chinese. Baidu uses this process to build its corpus of English-to-Mandarin word pairs, which it needs to train its translation engine.
Often described as the Google of China, Baidu has spent an estimated $2.9 billion on research and development over the past two and a half years, most of it on AI. Baidu is estimated to have 100 million English-to-Chinese word pairs, while Alphabet is estimated to have 500 million, according to a 2016 article in Science magazine.
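A translation corpus of this kind can be pictured as aligned source-target sentence pairs. A minimal sketch, with illustrative sentences rather than Baidu's actual data:

```python
# Minimal sketch of a parallel corpus: aligned English-Mandarin pairs
# of the kind used to train a translation engine. Sentences are
# illustrative examples, not real training data.
parallel_corpus = [
    ("Hello", "你好"),
    ("Thank you", "谢谢"),
    ("Where is the train station?", "火车站在哪里？"),
]

def to_training_examples(corpus):
    """Turn each aligned pair into a (source, target) training example."""
    return [{"src": en, "tgt": zh} for en, zh in corpus]

examples = to_training_examples(parallel_corpus)
print(len(examples))  # 3
```

At scale, the quality of a translation engine depends heavily on how many such pairs are available, which is why the corpus counts above matter.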
Facing more competition and slower growth in the search ad business, Baidu is looking to AI for diversification and growth. Baidu CEO Robin Li was quoted as saying, “The era of mobile internet has ended. We’re going to aggressively invest in AI, and I think it’s going to benefit a lot of people and transform industry after industry.”
For more information, go to Bloomberg Businessweek.
“AI on the Edge” to Rely on
New Chips, Volta Architecture
AI is a runaway success and will be a huge driver of future economic growth, as we reported in this week’s newsletter, citing a report by Seeking Alpha, an investment research site run by investors and industry experts rather than sell-side analysts.
Especially interesting in the report is a new generation of AI chips under development, designed to move AI from the cloud to the device, usually a mobile phone. NVIDIA’s new chip architecture, Volta, is programmed through the company’s CUDA platform, which targets parallel data processing and AI workloads in particular.
Tensor Cores are a breakthrough technology designed to speed up AI workloads. Volta’s Tensor Cores are reported to deliver 12 times the deep learning training throughput of the previous Pascal generation, which itself has performed well in deep learning. NVIDIA’s Volta GPU architecture is reported to offer deep learning performance of 100 teraflops (a measure of GPU compute throughput) with 640 Tensor Cores. Google is reported to have published recent benchmarks showing that TensorFlow scales almost linearly with the number of GPUs used.
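“Almost linear” scaling can be quantified as parallel efficiency: measured throughput on N GPUs divided by N times the single-GPU throughput. A minimal sketch with made-up numbers, not benchmark results:

```python
def scaling_efficiency(throughput_n, throughput_1, n_gpus):
    """Fraction of ideal linear speedup achieved with n_gpus GPUs."""
    return throughput_n / (n_gpus * throughput_1)

# Illustrative numbers only: 1 GPU at 1,000 images/sec,
# 8 GPUs at 7,600 images/sec.
eff = scaling_efficiency(7600, 1000, 8)
print(f"{eff:.0%}")  # 95%
```

An efficiency near 100% means adding GPUs keeps paying off almost proportionally, which is what the reported benchmarks suggest.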
These advances may enable “AI on the edge” – on the device rather than in the cloud – which would have these advantages:
- Better privacy and security
- Lower latency
- Operation without an internet connection
Current AI assistants such as Siri, Cortana and Alexa are all hosted in the cloud and require internet connections to access. The reason is that AI functionality requires computing horsepower that only data centers can currently provide. This also poses a privacy issue, since cloud-hosted AI is most effective when observing the actions of the user. Thus, companies are looking for ways to host more AI functionality on the device.
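One common way to shift functionality onto the device is a hybrid design: run a small local model first and fall back to the cloud only when it is not confident. A simplified sketch, with all names and thresholds hypothetical:

```python
def answer_query(query, local_model, cloud_model, confidence_threshold=0.8):
    """Try the on-device model first; fall back to the cloud model
    only when the local answer is not confident enough."""
    answer, confidence = local_model(query)
    if confidence >= confidence_threshold:
        return answer, "device"   # low latency, no network round trip
    return cloud_model(query), "cloud"

# Toy stand-ins for real models.
local = lambda q: ("turn on lights", 0.9) if "lights" in q else ("unsure", 0.2)
cloud = lambda q: "cloud answer"

print(answer_query("lights on please", local, cloud))  # ('turn on lights', 'device')
print(answer_query("book me a flight", local, cloud))  # ('cloud answer', 'cloud')
```

The more capable the on-device model and chip, the more queries stay on the left branch, which is exactly the advantage the new AI chips aim to enable.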
In addition to NVIDIA, work on chips to enable AI on the edge is happening at ARM, IBM, Qualcomm, Apple and a startup called Groq, founded by some of the people who developed the Tensor Processing Unit at Google. Seeking Alpha notes that the development of specialist AI chips for local rather than cloud use is only beginning, and could create a new market.
For more information, go to Seeking Alpha.
Jeff Immelt Staged AI Rollout
at GE Then Passed CEO Baton
General Electric announced on Monday that Jeff Immelt is retiring as CEO and John Flannery is taking over. On the job for 16 years, succeeding Jack Welch, Immelt led an effort to have GE invest heavily in software and artificial intelligence, as he refocused the company on industry and away from finance.
GE, which moved its headquarters to Boston last year, is involved in these core businesses:
Engines and Generators: Machines representing the most advanced applications of the 125-year-old company, with a basis in materials science and physics. These products also exploit what Immelt has called the “third pillar” of GE technology in the future – data and smart software. Sensors on a new jet engine, for example, stream data on temperature, fuel consumption, vibration and other measurements so that the engine can signal when it needs preventive maintenance. Service over the life of an engine means revenue for 30 years or more, which analysts estimate is worth eight times the value of the engine sale.
Oil Field Gear: GE is working on a deal to merge its oil and gas business, which has been under pressure, with Baker Hughes. The same rationale applies: heavy equipment brings in service revenue, and lends itself to efficiency gains using sensor data and software.
Healthcare: GE’s healthcare business makes and sells medical imaging equipment. In a recent initiative announced with Partners HealthCare in Boston, GE committed to employing AI to improve healthcare. The effort will include clinical and technology experts at Mass General and Brigham and Women’s hospitals working with engineers and developers at GE. The companies will begin by working on software to more quickly interpret medical images; over time, they want to create applications for genomics, population health and other areas of medicine.
Flannery, then chief executive of GE Healthcare, was quoted in the Boston Globe as saying, “What we see as the future of healthcare is applying data and analytics and machine learning to create a rapidly different outcome for patients. The possibilities are vast and significant.”
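The preventive-maintenance signal described above under Engines and Generators can be sketched as a simple rule over streaming sensor readings. This is only an illustration; the field names and limits are hypothetical, and real engine analytics use far richer models:

```python
# Hypothetical sensor limits; real engine analytics use far richer models.
LIMITS = {"temperature_c": 900, "vibration_mm_s": 7.5, "fuel_flow_kg_s": 1.2}

def needs_maintenance(reading):
    """Return the names of any sensors whose reading exceeds its limit,
    signaling that a preventive maintenance check is due."""
    return [name for name, limit in LIMITS.items() if reading.get(name, 0) > limit]

reading = {"temperature_c": 915, "vibration_mm_s": 6.1, "fuel_flow_kg_s": 1.0}
print(needs_maintenance(reading))  # ['temperature_c']
```

In practice the value comes from trends across many such readings over an engine's 30-year service life, not from any single threshold check.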