Artificial Intelligence, The True Beginning Occurred In 1912
A well-crafted blog entry authored by Professor Herbert Bruderer at the Communications of the ACM blog, detailing the true start of Artificial Intelligence in 1912.
"If one takes chess as a yardstick for artificial intelligence, however, this branch of research begins much earlier, at the latest in 1912 with the chess automaton of the Spaniard Leonardo Torres Quevedo (cf. Fig. 1). In the chess playing Turk (1769) of Wolfgang von Kempelen, a human player was hidden." - via Herbert Bruderer, retired lecturer of Didactics in Computer Science at ETH Zürich
Reportedly, Professor Bruderer is now retired from his lecturer role in Didactics of Computer Science at ETH Zürich and has recently taken on the role of Historian of Technology.
Quantum Of Tuesday: Google Quantum AI's Paper, Whereabouts Known
via Bianca Bharti, writing for Canada's National Post, comes news of Google, Inc.'s (Nasdaq: GOOGL) stunning accomplishment in quantum computation, described in a paper entitled 'Quantum supremacy using a programmable superconducting processor' and published at NASA (since taken down, but available here, along with a supplementary-information document entitled 'Google Quantum Supremacy (Supplementary information) 09-2019' here). Read it and weep for the quantum advertising onslaught from Sergey and Larry, coupled with the complete demise of your future self's privacy in all alternate universes...
"The tantalizing promise of quantum computers is that certain computational tasks might be executed exponentially faster on a quantum processor than on a classical processor. A fundamental challenge is to build a high-fidelity processor capable of running quantum algorithms in an exponentially large computational space. Here, we report using a processor with programmable superconducting qubits to create quantum states on 53 qubits, occupying a state space of dimension 2^53 ≈ 10^16. Measurements from repeated experiments sample the corresponding probability distribution, which we verify using classical simulations. While our processor takes about 200 seconds to sample one instance of the quantum circuit 1 million times, a state-of-the-art supercomputer would require approximately 10,000 years to perform the equivalent task. This dramatic speedup relative to all known classical algorithms provides an experimental realization of quantum supremacy on a computational task and heralds the advent of a much-anticipated computing paradigm." - via the Google AI Quantum team and collaborators, enumerated within the paper
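The abstract's state-space figure is easy to check for yourself; a minimal Python sketch of the arithmetic (nothing here is from the paper beyond the 53-qubit count):

```python
import math

# A register of n qubits spans a state space of dimension 2^n.
n_qubits = 53
dimension = 2 ** n_qubits

print(dimension)                      # 9007199254740992
print(round(math.log10(dimension)))   # 16 -- i.e. 2^53 ~ 10^16, as the paper states
```

Sampling a distribution over roughly ten quadrillion basis states is what makes the classical verification so expensive.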
ACM Presents 2018 ACM A.M. Turing Award to Deep Learning Pioneers
The Association for Computing Machinery has announced the presentation of the 2018 ACM A.M. Turing Award to a trio of Deep Learning pioneers: Geoffrey Hinton, Yann LeCun, and Yoshua Bengio. Congratulations!
"ACM named Yoshua Bengio, Geoffrey Hinton, and Yann LeCun recipients of the 2018 ACM A.M. Turing Award for conceptual and engineering breakthroughs that have made deep neural networks a critical component of computing. Bengio is Professor at the University of Montreal and Scientific Director at Mila, Quebec’s Artificial Intelligence Institute; Hinton is VP and Engineering Fellow of Google, Chief Scientific Adviser of The Vector Institute, and University Professor Emeritus at the University of Toronto; and LeCun is Professor at New York University and VP and Chief AI Scientist at Facebook." - via the Association for Computing Machinery
MIT's Attack Detection via Supercomputing
In a not-too-astounding announcement, it seems MIT academicians have found a new use for super-computational resources: the utilization of supercomputing resources targeting so-called 'compressed bundles,' with the ostensible outcome of attack detection. I'll wager there are forward-thinking data scientists bent over the same workwheel using so-called 'Cloud Computing' for the same task (at a tenth of a percent of the cost per flop). Just sayin'...
'"If you're trying to detect anomalous behavior, by definition that behavior is rare and unlikely," says Vijay Gadepally, a senior staff member at the Lincoln Laboratory Supercomputing Center (LLSC). "If you're sampling, it makes an already rare thing nearly impossible to find."' - via the Lincoln Laboratory at the Massachusetts Institute of Technology
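Gadepally's point about sampling is just arithmetic; a hypothetical back-of-the-envelope sketch (the rates below are illustrative, not figures from the Lincoln Laboratory work):

```python
# If anomalies occur at rate p and you inspect only a fraction f of
# the records, the expected number of anomalies you actually observe
# shrinks by that same fraction -- rare events become nearly invisible.
total_records = 1_000_000_000   # a billion network events (illustrative)
anomaly_rate = 1e-7             # one anomaly per ten million events (illustrative)
sample_fraction = 0.01          # inspect 1% of the traffic

anomalies_present = total_records * anomaly_rate
anomalies_seen = anomalies_present * sample_fraction

print(anomalies_present)  # anomalies present in the full stream (~100)
print(anomalies_seen)     # anomalies expected in the 1% sample (~1)
```

Hence the appeal of supercomputing: scan everything rather than sample, and the hundred-in-a-billion events stay findable.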
Net Neutrality, The Case Against →
Notwithstanding the FCC's vote in the affirmative (to strike the Net Neutrality rules), the Commission is still reportedly working on the final document and has not released the official decision at the time of this writing (which may surprise some readers). Consequently, we are publishing this superlative opinion piece by Professor Harsha Madhyastha of the University of Michigan. Enjoy!
Professor Harsha Madhyastha (Associate Professor in the University of Michigan's Computer Science and Engineering Division) writes eloquently at the IEEE's Spectrum Magazine, and enthralls us with a nicely logical case against Net Neutrality. Today's Must Read.
2017/01/06 - Update: The FCC has released the Commission's Order here.
Arnold Nordsieck's Synchro Operated Differential Analyzer →
Ladies and Gentlemen, Girls and Boys, behold: The Nordsieck Synchro Operated Differential Analyzer.
"As with other analog computers, each calculation required its own setup. You plugged in the tangle of patch cords to the left in a particular pattern. The cords served as the computer's control program, with other parts of the program embodied and executed by the spinning disks, gears, rotating shafts, cranks, and the like. (You can read Nordsieck's early description of the computer here [PDF] and his written instructions here [PDF].)" - via IEEE Spectrum Magazine
50 Qubits →
An astounding image (some might call it a percolator of the multiverse) and announcement via IBM (NYSE: IBM) of the company's newly minted 50-qubit processor. Today's Must Read.
"The first IBM Q systems available online to clients will have a 20 qubit processor, featuring improvements in superconducting qubit design, connectivity and packaging. Coherence times (the amount of time available to perform quantum computations) lead the field with an average value of 90 microseconds, and allow high-fidelity quantum operations. IBM has also successfully built and measured an operational prototype 50 qubit processor with similar performance metrics. This new processor expands upon the 20 qubit architecture and will be made available in the next generation IBM Q systems." - via IBM
DARPA, The Implantation Instantiation →
News, via the IEEE's Spectrum Magazine (and reported by Eliza Strickland), of a challenge from DARPA's Neural Engineering System Design program; in this case, the need for a recording brain implant has been identified. Indeed.
Steam Computation, Polynomial Edition
Constructed by Dr. Piers Plummer and team (Dr. Doron Swade, Professor Adrian Johnstone, and Professor Elizabeth Scott), direct from the Department of Computer Science at Royal Holloway University of London comes this superlative steam-driven compute device... Eagle-eyed readers may note the brass bits dropping onto the floor plate of the device (due to the gear teeth grinding against the opposing gear's cog teeth). H/T
The IoT Chain →
Meanwhile, in troubling IoT news: in a paper (published by the IACR) entitled "IoT Goes Nuclear: Creating a ZigBee Chain Reaction," authored by Eyal Ronen, Colin O'Flynn, Adi Shamir, and Achi-Or Weingarten (a Weizmann MSc student), we find - perhaps - the ultimate ZigBee nightmare... Today's Must Read (and while you're at it, check out the video to round out your day). Thanks and Tip O' The Hat
MOAAB* →
via Vice's Motherboard writer Jason Koebler comes this bad-news-for-advertisers screed detailing the work of Princeton and Stanford researchers to corral said ad-miscreants... The research team has crafted a computer-vision-based ad-blocker that is reportedly 100% effective at its intended purpose. Phenomenal.
* Mother of All Ad Blockers
Mictyris longicarpus, Ratiocinor Infra Aedificium →
Via Futility Closet comes an outstanding computational methodology utilizing blue soldier crabs as the componentized logic-delivery mechanism for a bio-computational device (in this case, a logic gate). Certainly today's Must Read.
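For readers wondering what a crab-powered gate computes: in collision-based unconventional computing, a swarm in either input channel propagates to a shared output channel, and colliding swarms merge and continue as one. A minimal Python sketch of that behavior as a truth table (the OR-gate reading is an assumption on our part; the post itself just says "a logic gate"):

```python
def collision_or_gate(swarm_a: bool, swarm_b: bool) -> bool:
    """Collision-based OR: a swarm present in either input channel
    reaches the output; two swarms merge on collision and still
    arrive as one, so the output fires when either input does."""
    return swarm_a or swarm_b

# Enumerate the truth table.
for a in (False, True):
    for b in (False, True):
        print(a, b, "->", collision_or_gate(a, b))
```

The physical version swaps booleans for swarms of Mictyris longicarpus steered through channels cut into the substrate; the logic is the same.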