By Paula Parisi, July 15, 2024
OpenAI has partnered with the Los Alamos National Laboratory to study the ways artificial intelligence frontier models can assist with scientific research in an active lab environment. Established in 1943, the New Mexico facility is best known as home to the Manhattan Project and the development of the world’s first atomic bomb. It currently focuses on national security challenges under the direction of the Department of Energy. As part of the new partnership, the lab will work with OpenAI to produce what it describes as a first-of-its-kind study on the impact of artificial intelligence on biosecurity. Continue reading OpenAI Teams with Los Alamos for Frontier Model Research
By Paula Parisi, November 15, 2022
Intel is taking on Nvidia and AMD with its Max Series for high-performance computing and artificial intelligence. The company unveiled two products under the Max umbrella: the Intel Xeon Max CPU and the Intel Data Center Max Series GPU. The Max GPU is Intel’s highest-density processor, packing over 100 billion transistors into a 47-tile package with up to 128GB of high-bandwidth memory. The oneAPI open software ecosystem provides a single programming environment for both new processors, with Intel’s 2023 oneAPI and AI tools enabling the Intel Max Series products’ advanced features. Continue reading Intel Targets Supercomputing with New Max Series CPU, GPU
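As an illustration of what a single programming environment means in practice, here is a minimal, hypothetical sketch using Intel Extension for PyTorch, one of the oneAPI-based AI tools; the toy model and the “xpu” device handling are assumptions for illustration, not Intel sample code.

```python
# Hypothetical sketch: the same PyTorch inference code targeting either the
# Xeon Max CPU ("cpu") or the Data Center GPU Max Series ("xpu") through
# Intel Extension for PyTorch, one of the oneAPI-based AI tools.
# Assumes `torch` and `intel_extension_for_pytorch` are installed.
import torch
import intel_extension_for_pytorch as ipex

model = torch.nn.Linear(1024, 1024)
model.eval()

# Pick the Intel GPU if IPEX exposes one, otherwise fall back to the CPU;
# only the device string changes, not the model or the inference code.
device = "xpu" if hasattr(torch, "xpu") and torch.xpu.is_available() else "cpu"
model = model.to(device)
model = ipex.optimize(model)  # apply Intel-specific kernel optimizations

with torch.no_grad():
    x = torch.randn(8, 1024, device=device)
    print(model(x).shape, "on", device)
```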
By Debra Kaufman, February 12, 2020
Artificial intelligence and quantum computing would be awarded increased funding under the Trump administration’s proposed $4.8 trillion budget. The Defense Department and the National Science Foundation would receive more funds for AI research, and $25 million would go toward the creation of a national “quantum Internet,” aimed at making it more difficult to hack into digital communications. The proposed funding comes at a time when China has prioritized both technologies and the U.S. seeks to catch up. Continue reading Trump Administration Plans to Fund AI, Quantum Computing
By Rob Scott, January 20, 2020
During CES 2020 in Las Vegas this month, IBM announced its continued efforts to develop practical applications using quantum computing. The company emphasized the expansion of the IBM Q Network, which now includes more than 100 organizations across industries such as air travel, automotive, banking, electronics, energy, health and insurance. IBM also announced new collaborations with Anthem, Delta Air Lines, Georgia Tech, Goldman Sachs, Los Alamos National Laboratory, Stanford University, Wells Fargo and Woodside Energy, in addition to a number of government research labs and startups. Continue reading IBM Expands Partnerships to Advance Quantum Computing
By Debra Kaufman, October 19, 2018
The term “deepfakes” describes the use of artificial intelligence and computer-generated tricks to make a person (usually a well-known celebrity or politician) appear to do or say “fake” things. For example, actor Alden Ehrenreich’s face was recently replaced by Harrison Ford’s face in footage from “Solo: A Star Wars Story.” The technique could be meant simply for entertainment or for more sinister purposes. The more convincing deepfakes become, the more unease they create among AI scientists and the military and intelligence communities. As a result, new methods are being developed to help combat the technology. Continue reading Scientists and Military Look for Key to Identifying Deepfakes
By Debra Kaufman, October 8, 2018
Canadian company D-Wave Systems launched the Leap Quantum Application Environment, a web portal that aims to offer public access to quantum computing for “any and all developers.” D-Wave R&D executive vice president/chief product officer Alan Baratz says Leap will provide such developers “immediate, free, real-time access to a live quantum computer.” Quantum computing, which is expected to dramatically improve the ability to manipulate and analyze data, has thus far had a very limited user base. Continue reading D-Wave Offers Free Real-Time Quantum Computing For All
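Developer access of the kind Baratz describes typically goes through D-Wave’s open-source Ocean SDK; the following is a minimal sketch, assuming the SDK is installed and a Leap API token is configured, not an excerpt from D-Wave’s documentation.

```python
# Minimal sketch of submitting a toy problem to a live D-Wave quantum computer
# through Leap, using the open-source Ocean SDK (assumes `dwave-ocean-sdk` is
# installed and a Leap API token has been set up, e.g. via `dwave config create`).
from dwave.system import DWaveSampler, EmbeddingComposite

# Toy QUBO: minimize x0 + x1 - 2*x0*x1; the ground states are 00 and 11.
Q = {(0, 0): 1, (1, 1): 1, (0, 1): -2}

# EmbeddingComposite maps the logical variables onto the QPU's qubit graph.
sampler = EmbeddingComposite(DWaveSampler())
sampleset = sampler.sample_qubo(Q, num_reads=100)

# Best sample found and its energy.
print(sampleset.first.sample, sampleset.first.energy)
```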
By Debra Kaufman, March 10, 2016
The advent of digital acquisition has made long-term storage more complicated for media and entertainment companies, which until now have been dependent on tape-based solutions. Sony has unveiled Everspan, an optical disc technology it guarantees will last for 100 years. That 100-year guarantee would relieve companies of the expensive, time-consuming need to migrate libraries to new technology. Each disc stores 300 gigabytes, and Everspan uses up to 64 drives to read data at extremely high speed. Continue reading Sony Introduces Optical Disc Archival System to Replace Tape
By Debra Kaufman, August 28, 2015
Cybersecurity technology from Los Alamos National Laboratory is now available to banks and other private-sector businesses via the consulting firm Ernst & Young. The New Mexico lab, benefiting from the $1 billion the U.S. spends each year on unclassified cybersecurity research, has developed a great deal of relevant technology, but is not set up to market the results of its own research. Ernst & Young, which consults on cybersecurity, will market the lab’s products and add its own expertise. Continue reading New Initiative: U.S. Offers Cybersecurity Tech to Private Sector