By Rochelle Winters, February 20, 2020
The HPA Tech Retreat kicked off with an ambitious daylong demo that highlighted innovations in content creation, management and distribution technology and workflows. Supersession chair Joachim Zell, VP of technology for EFILM, walked the audience through numerous elements of an HDR production: filming, editing and finishing two scenes that provided the final chapters for a short film. The process, much of which involved workflows in the cloud, featured multiple cameras, on-set management and collaboration platforms, editorial, dailies and digital intermediate color grading systems, as well as online mastering and distribution platforms. Continue reading HPA Tech Retreat: The Latest Workflows for Virtual Production
By Erick Moen, January 13, 2020
CES is not a computing show, but this year’s edition felt silicon-centric thanks to major announcements from Intel and AMD. Intel revealed more details about its next CPU, Tiger Lake, which boasts improved graphics and AI performance. The company also offered a glimpse of its first discrete GPU. But the show arguably belonged to AMD, which continued its year-long renaissance with a keynote unveiling mobile CPUs, a new midrange GPU, and the world’s fastest workstation processor. Continue reading AMD vs. Intel: The Computing Wars Ramp Up in Las Vegas
By Debra Kaufman, January 6, 2020
Apple inked a multi-year licensing agreement with U.K. company Imagination Technologies, giving it “wider range” access to that company’s IP including a new ray-tracing technology. Observers believe the move signals that Apple plans on adding ray tracing to its chips “in the foreseeable future.” Ray tracing is a graphics technology that enables imagery to be created with real-world lighting, reflections and shadows, creating a much more photorealistic result. Nvidia first brought ray tracing to PC GPUs in August 2018. Continue reading Apple Inks Deal with Imagination for Ray-Tracing Chip Tech
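As context for the technique described above: ray tracing works by casting a ray from the camera through each pixel and testing it against scene geometry, which is what makes physically plausible reflections and shadows possible. A minimal ray-sphere intersection sketch in Python (the function and scene are illustrative only, not Apple's or Imagination's implementation):

```python
import math

def ray_sphere_hit(origin, direction, center, radius):
    """Return the distance along the ray to the nearest sphere hit, or None.

    Solves |origin + t*direction - center|^2 = radius^2 for t,
    a quadratic in t; a real, positive root means the ray hits.
    """
    ox, oy, oz = (origin[i] - center[i] for i in range(3))
    dx, dy, dz = direction
    a = dx * dx + dy * dy + dz * dz
    b = 2.0 * (ox * dx + oy * dy + oz * dz)
    c = ox * ox + oy * oy + oz * oz - radius * radius
    disc = b * b - 4 * a * c
    if disc < 0:
        return None  # ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / (2 * a)  # nearer of the two roots
    return t if t > 0 else None

# Camera at the origin looking down -z at a unit sphere centered 5 units away.
t = ray_sphere_hit((0, 0, 0), (0, 0, -1), (0, 0, -5), 1.0)
print(t)  # 4.0: the ray enters the sphere one radius short of its center
```

A full renderer repeats this test for every pixel against every object, then traces secondary rays toward lights and reflective surfaces; the cost of doing that at scale is why dedicated ray-tracing hardware matters.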
By Debra Kaufman, August 26, 2019
Los Altos, CA-based startup Cerebras, dedicated to advancing deep learning, has created a computer chip almost nine inches (22 centimeters) on each side — huge by the standards of today’s chips, which are typically the size of postage stamps or smaller. The company plans to offer this chip to tech companies to help them improve artificial intelligence at a faster clip. The Cerebras Wafer-Scale Engine (WSE), which took three years to develop, has impressive stats: 1.2 trillion transistors, 46,225 square millimeters, 18 gigabytes of on-chip memory and 400,000 processing cores. Continue reading Cerebras Builds Enormous Chip to Advance Deep Learning
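For a sense of scale, the reported totals can be divided through directly (a rough back-of-the-envelope from the figures above, not Cerebras's own per-core numbers):

```python
# Back-of-the-envelope figures from the reported Cerebras WSE specs.
transistors = 1.2e12       # 1.2 trillion
area_mm2 = 46_225          # square millimeters
memory_bytes = 18e9        # 18 GB on-chip (taking 1 GB = 10^9 bytes)
cores = 400_000

mem_per_core_kb = memory_bytes / cores / 1e3
density_m_per_mm2 = transistors / area_mm2 / 1e6

print(f"{mem_per_core_kb:.0f} KB of on-chip memory per core")  # 45 KB
print(f"{density_m_per_mm2:.1f}M transistors per mm^2")        # ~26.0M
```

Keeping tens of kilobytes of memory adjacent to each core is the point of the wafer-scale approach: weights and activations stay on-chip instead of crossing a slow off-chip memory bus.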
By Debra Kaufman, June 7, 2019
British startup Graphcore has developed an AI chip for computers that attempts to mimic the neurons and synapses of the human brain, so that it can “ponder” questions rather than analyze data. Up until now, said Graphcore co-founder and chief executive Nigel Toon, GPUs and CPUs have excelled at precision, using vast amounts of energy to achieve small steps. Toon and Graphcore co-founder and CTO Simon Knowles dub their less precise chips “intelligence processing units” (IPUs), which excel at aggregating approximate data points. Continue reading Graphcore Builds Intelligence Processing Units For Better AI
By Rob Scott, March 19, 2019
Nvidia made a number of compelling announcements at this week’s GPU Technology Conference (GTC 2019) in San Jose, California. The company unveiled its GauGAN AI image creator that uses generative adversarial networks (GANs) to turn sketches into nearly photorealistic images. As part of its cloud pursuits, the company unveiled its latest RTX server configuration that is designed for Hollywood studios and those who want to create visual content quickly (each server pod can support up to 1,280 GPUs). Nvidia also announced partnerships with 3D software makers including Adobe, Autodesk and Unity to integrate Nvidia’s RTX ray-tracing platform. Continue reading Nvidia Demos New Products at Deep Learning & AI Confab
By Debra Kaufman, October 22, 2018
Los Angeles-based OTOY, a company that has created software used for visual effects in projects such as “Westworld” and “The Avengers,” has launched a blockchain- and cryptocurrency-based rendering platform called RNDR to help other content creators harness the power of thousands of graphics processing units (GPUs). OTOY’s strategy is to gather a group of computer owners who can share their GPUs in the cloud in a decentralized way, and thus trade GPU power among members to accomplish data-intense imagery. Continue reading Blockchain-Based RNDR Harnesses Power of 14,000 GPUs
By Debra Kaufman, May 30, 2018
Facebook has used Intel CPUs for many of its artificial intelligence services, but the company is changing course to adapt to the pressing need to better filter live video content. At the Viva Technology industry conference in Paris, Facebook chief AI scientist Yann LeCun stated that the company plans to make its own chips for filtering video content, because more conventional methods suck up too much energy and compute power. Last month, Bloomberg reported that the company is building its own semiconductors. Continue reading Facebook to Develop Live Video Filtering Chips for Faster AI
By Yves Bergquist, January 10, 2018
CES 2018 is out of the gates, and, as expected, artificial intelligence is still very much present in products, conversations and conference panels. Still in its quest to become synonymous with AI, Nvidia did not disappoint at its press event Sunday and its “Autonomous Machines” keynote Tuesday morning. From doubling down on autonomous vehicles to AI-composed music (in partnership with Disney), to a technically impressive foray into intelligent video analytics to power smart cities, the CES darling is still — by far — the biggest AI enthusiast at the show. Continue reading Artificial Intelligence Front But Not Center at CES Trade Show
By Debra Kaufman, November 13, 2017
Advanced Micro Devices (AMD), Intel and Nvidia are racing to develop artificial intelligence chips as the market for AI hardware and software skyrockets. Nvidia, which has specialized in high-end GPUs, and AMD, its chief rival, have found that their products have proven useful in AI applications, an incentive for them to focus on that sector. Growth in the semiconductor industry has been volatile in recent months, leading to consolidation, such as the recently announced $105 billion bid by Broadcom to acquire Qualcomm. Continue reading AMD, Intel, Nvidia Race to Build AI Chips for Booming Market
By Debra Kaufman, August 28, 2017
At The Reel Thing, an AMIA (Association of Moving Image Archivists) conference, Hollywood technologists and filmmakers gathered to hear presentations on challenges in restoration, remastering and archiving. PurePix Images chief executive Michael Inchalik and University of Georgia mathematics professor Alexander Petukhov looked at how Algosoft is developing software to repair vertical scratches, one of the toughest challenges in digital restoration. “We’re discussing a high-level restoration workflow,” said Inchalik. Continue reading New Software Tackles Scratch Removal for Film Restoration
By Debra Kaufman, April 27, 2017
At the ETC conference on VR/AR, Source Sound VR’s Linda Gedemer moderated a panel of audio experts to talk about today’s VR audio tools, the trends animating VR audio, and their wish list for future technology. Everyone agreed that the audio toolsets for VR/AR seem to change nearly every day, although a handful of tools — such as Dolby Atmos and game tool Wwise — stand out as being widely accepted in this industry sector. Panelists from OSSIC, Nokia, and Source Sound VR described the toolsets they use. Continue reading NAB 2017: A Look at the Evolving Trends in VR and AR Audio
By Debra Kaufman, April 14, 2017
Advanced Micro Devices (AMD) has acquired Nitero, a startup responsible for a 60-gigahertz wireless chip that transmits high-res video without latency. AMD, which bought the company for an undisclosed price, believes that Nitero’s chip will enable it to push sales of more wireless virtual reality headsets. Sales of VR headsets, according to AMD executive Roy Taylor, have been limited due to their need to be tethered to a computer. Nitero was originally a spinoff from a research center sponsored by the Australian government. Continue reading AMD Pitches Latency-Free Virtual Reality via Super-Fast Wi-Fi
By Debra Kaufman, August 17, 2016
After first debuting the Maxwell-based GTX 980 graphics chip in a notebook last year, Nvidia has now upped its game with notebooks and laptops powered by its GTX 1000 series chips, more specifically the GTX 1060, GTX 1070 and GTX 1080. These new GPU chips, which Nvidia declares “VR-ready,” use the company’s more efficient Pascal architecture to provide nearly identical performance to their desktop counterparts; only the GTX 1060 runs at a slightly slower base clock speed in a notebook. Continue reading Nvidia’s New GTX Series Super-Powers Laptops, Enables VR
By Debra Kaufman, June 2, 2016
Amazon is testing an as-yet-unannounced new cloud service that will let businesses run a wider range of artificial intelligence software on its computers, according to people close to the situation. This move puts Amazon, which launched Amazon Web Services in a limited offering in this area last year, in closer competition with Google, Microsoft and IBM, which have already launched various cloud services. The new service will help development of pattern recognition, speech transcription and other robust applications. Continue reading Amazon Creating New Cloud Services for Artificial Intelligence