Regulate Tech More, Say Californians
The technology industry should be more regulated, say 58% of 1,500 Californians responding to a January 2019 survey by the Edelman Trust Barometer (Edelman.com). That compares with 46% in Edelman's 2018 survey. And 69% of tech workers in the sample say their industry has been under- rather than over-regulated.
Still, 92% believe the industry "is good at what it does," and 78% want tech companies to be active members of the communities in which they operate.
China Publishing More Top AI Papers than U.S.
If current trends continue, China will publish more of the 10% most-cited AI papers than the U.S. in 2020, and more of the 1% most-cited papers by 2025. That's what Field Cady and Oren Etzioni of the Allen Institute for Artificial Intelligence learned from the Semantic Scholar project's analysis of more than two million academic AI papers published through the end of 2018.
"Citation counts are a lagging indicator of impact, so our results may understate the rising impact of AI research originating in China," they write (medium.com, March 13). "This focus on high-impact papers," they add, "shows a clear trend of Chinese ascendance in the field of AI."
Blockchain Online Payment Network Planned
Akamai Technologies and Mitsubishi UFJ Financial Group will form a joint venture with plans for a blockchain-based online payment network. Their Global Open Network will enable "next-generation transaction security, scale and responsiveness," according to the announcement.
U.S. R&D May Have Risen to $542 billion in 2017
The latest data from the National Science Foundation's Center for Science and Engineering Statistics indicate that research and experimental development performed in the U.S. totaled $515.3 billion in 2016 (current dollars). The estimated total for 2017, based on performer-reported expectations, is $542.2 billion.
These totals compare with $404.8 billion (current dollars) for U.S. R&D in 2008, just before the onset of the main economic effects of the financial crisis and recession. Details in InfoBriefs, NSF 19-308, Feb. 27, 2019.
$3 Trillion IoT Market Projected by 2026
Integration of all our devices with internet connectivity should boost the IoT market to more than $3 trillion annually by 2026, according to the third annual Global IoT Executive Survey from the Business Insider Intelligence newsletter (businessinsider.com).
Based on responses from 400 executives around the world, Business Insider projects that the 10 billion IoT devices in use in 2018 will surge to more than 64 billion by 2025.
IBM’s Cloud Predictions
Integrated public and private clouds will replace "one cloud fits all" for many organizations in 2019. That's the first of five IBM cloud predictions from IBM "Cloud computing news," Dec. 31, 2018 (ibm.com). The others: More companies will embrace open cloud technology; teams will need new skills for API management, data integration and more; security best practices will be essential; edge computing will "explode."
Last Blockbuster on Earth
The Blockbuster in Bend, Oregon, became the sole remaining Blockbuster on Earth when the last Australian store closed in March, according to an Australian Associated Press report on cnet.com, March 7, 2019.
Blockbuster peaked in 2004, when it had 60,000 employees and 9,000 stores worldwide, International Business Times reported. At the time, the home movie and video game rental chain had a market value of $5 billion and revenues of $5.9 billion. Competition from streaming services received some of the blame.
Cutting AI Training Time ‘Drastically’
North Carolina State University researchers have announced a technique that cuts training time for deep learning networks by as much as 69% without sacrificing accuracy. The technique was reported in a paper by Xipeng Shen, a professor of computer science at NC State, and Ph.D. students Lin Ning and Hui Guan.
"One of the biggest challenges facing the development of new AI tools is the amount of time and computing power it takes to train deep learning networks to identify and respond to the data patterns that are relevant to their applications," said Prof. Shen. "We've come up with a way to expedite that process, which we call Adaptive Deep Reuse. We have demonstrated that it can reduce training times by up to 69 percent without accuracy loss."
Testing on three networks and data sets widely used as deep learning test beds "demonstrates that the technique drastically reduces training times," said Hui Guan.
The paper, “Adaptive Deep Reuse: Accelerating CNN Training on the Fly,” was presented at the 35th IEEE International Conference on Data Engineering, April 8-11 in Macau SAR, China.
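The core idea behind deep reuse, as described in the researchers' publications, is that many of the input vectors flowing through a convolutional layer are nearly identical, so their products with the weight matrix need not be recomputed. The sketch below illustrates that reuse principle in a hedged, simplified form: it groups similar input rows with a random-projection (LSH-style) signature and computes each group's matrix product only once. The function name, the clustering details, and the parameters are this sketch's own assumptions, not the authors' code, which additionally adapts the similarity threshold as training progresses.

```python
import numpy as np

def reuse_matmul(X, W, n_bits=16, seed=0):
    """Approximate X @ W by reusing computation across similar rows of X.

    Rows of X are hashed to short binary signatures via random projection;
    rows sharing a signature form a cluster, and the product is computed
    once per cluster centroid instead of once per row. This is a toy
    illustration of the reuse idea, not the published algorithm.
    """
    rng = np.random.default_rng(seed)
    proj = rng.standard_normal((X.shape[1], n_bits))
    sigs = X @ proj > 0                        # one sign bit per projection
    keys = sigs @ (1 << np.arange(n_bits))     # pack bits into an integer key
    out = np.empty((X.shape[0], W.shape[1]))
    for key in np.unique(keys):
        idx = np.where(keys == key)[0]
        centroid = X[idx].mean(axis=0)         # representative of the cluster
        out[idx] = centroid @ W                # compute once, reuse for all
    return out
```

When clusters contain genuinely similar rows, the centroid product is a close stand-in for each row's product, and the number of matrix multiplications drops from the number of rows to the number of clusters; the accuracy/speed trade-off is governed by how coarse the signatures are (here, `n_bits`).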