EA Announces New AI-Powered, Cloud-Native Gaming Tech

Electronic Arts unveiled Project Atlas, its “cloud-native gaming” technology, via a Medium blog post by chief technology officer Ken Moss. Although he did not say when it would be fully deployed and functional, Moss described Project Atlas as designed to “harness the massive power of cloud computing and artificial intelligence and putting it into the hands of game makers in a powerful, easy to use, one-stop experience.” The game engine combines rendering, game logic, physics, animation, audio, and more.
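
To give a concrete, if simplified, picture of what it means for an engine to combine those subsystems, the toy Python sketch below ticks game logic, physics, animation, rendering and audio from a single update loop. It is purely illustrative; the class names are hypothetical and say nothing about how EA has actually architected Project Atlas.

```python
# Toy sketch of an engine loop that combines several subsystems.
# Hypothetical names for illustration only; not EA's Project Atlas design.
class Subsystem:
    name = "subsystem"
    def update(self, dt: float) -> None:
        print(f"{self.name}: step {dt:.4f}s")

class GameLogic(Subsystem): name = "game logic"
class Physics(Subsystem): name = "physics"
class Animation(Subsystem): name = "animation"
class Renderer(Subsystem): name = "renderer"
class Audio(Subsystem): name = "audio"

class Engine:
    """Runs every subsystem once per frame at a fixed timestep."""
    def __init__(self, subsystems):
        self.subsystems = subsystems

    def run(self, frames: int = 2, dt: float = 1 / 60) -> None:
        for _ in range(frames):
            for system in self.subsystems:
                system.update(dt)

Engine([GameLogic(), Physics(), Animation(), Renderer(), Audio()]).run()
```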

Blockchain-Based RNDR Harnesses Power of 14,000 GPUs

Los Angeles-based OTOY, a company that has created software used for visual effects in projects such as “Westworld” and “The Avengers,” has launched a blockchain- and cryptocurrency-based rendering platform called RNDR to help other content creators harness the power of thousands of graphics processing units (GPUs). OTOY’s strategy is to gather a group of computer owners who share their GPUs in the cloud in a decentralized way, trading GPU power among members to render data-intensive imagery.
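
As a rough sketch of the decentralized model, splitting a render job across member GPUs and crediting each owner for the frames contributed, consider the toy Python below. The class names and the per-frame token rate are hypothetical and do not represent OTOY’s actual RNDR protocol or pricing.

```python
# Toy model of a shared GPU rendering pool with token-based credit.
# Illustrative only; class names and rates are hypothetical.
from dataclasses import dataclass, field

@dataclass
class GPUNode:
    owner: str
    frames_rendered: int = 0
    tokens_earned: float = 0.0

@dataclass
class RenderPool:
    nodes: list = field(default_factory=list)
    token_rate: float = 0.25  # tokens credited per rendered frame (arbitrary)

    def submit_job(self, total_frames: int) -> None:
        """Split a render job evenly across pooled GPUs and credit each owner."""
        if not self.nodes:
            raise RuntimeError("no GPU nodes registered in the pool")
        share, remainder = divmod(total_frames, len(self.nodes))
        for i, node in enumerate(self.nodes):
            frames = share + (1 if i < remainder else 0)
            node.frames_rendered += frames
            node.tokens_earned += frames * self.token_rate

pool = RenderPool(nodes=[GPUNode("alice"), GPUNode("bob"), GPUNode("carol")])
pool.submit_job(total_frames=1000)
for node in pool.nodes:
    print(f"{node.owner}: {node.frames_rendered} frames, {node.tokens_earned:.2f} tokens")
```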

Nvidia Ray-Tracing Technology a Quantum Leap in Rendering

At SIGGRAPH 2018, Nvidia debuted its new Turing architecture featuring ray tracing, a rendering technique, for professional and consumer graphics cards. Considered the Holy Grail by many industry pros, ray tracing works by modeling light in real time as it intersects with objects, which makes it ideal for creating photorealistic lighting and VFX. Until now, real-time ray tracing has been impractical because it requires an immense amount of expensive computing power; even now, Nvidia’s professional Turing card costs $10,000.
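
For a sense of what “modeling light as it intersects with objects” looks like in code, the toy Python renderer below casts one ray per character-cell pixel, tests it against a single sphere, and shades each hit by its angle to the light. It is a conceptual sketch of the technique only and is unrelated to Nvidia’s hardware implementation.

```python
# Minimal ray tracing sketch: one ray per "pixel", one sphere, simple
# Lambertian shading. Conceptual illustration only.
import math

SPHERE_CENTER = (0.0, 0.0, -3.0)
SPHERE_RADIUS = 1.0
LIGHT_DIR = (0.5, 0.7, 0.5)  # direction toward the light (normalized below)

def normalize(v):
    length = math.sqrt(sum(c * c for c in v))
    return tuple(c / length for c in v)

def intersect_sphere(origin, direction):
    """Return the nearest positive ray parameter t of a ray/sphere hit, or None."""
    oc = tuple(o - c for o, c in zip(origin, SPHERE_CENTER))
    b = 2.0 * sum(d * o for d, o in zip(direction, oc))
    c = sum(o * o for o in oc) - SPHERE_RADIUS ** 2
    disc = b * b - 4.0 * c  # direction is unit length, so the quadratic's a == 1
    if disc < 0.0:
        return None
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 0.0 else None

def render(width=48, height=24):
    light = normalize(LIGHT_DIR)
    shades = " .:-=+*#%@"
    for j in range(height):
        row = ""
        for i in range(width):
            # Map the pixel to a ray direction on a simple pinhole camera.
            x = 2 * (i + 0.5) / width - 1
            y = 1 - 2 * (j + 0.5) / height
            direction = normalize((x, y, -1.0))
            t = intersect_sphere((0.0, 0.0, 0.0), direction)
            if t is None:
                row += " "
                continue
            hit = tuple(t * d for d in direction)
            normal = normalize(tuple(h - c for h, c in zip(hit, SPHERE_CENTER)))
            # Lambertian shading: brightness follows the angle between normal and light.
            brightness = max(0.0, sum(n * l for n, l in zip(normal, light)))
            row += shades[int(brightness * (len(shades) - 1))]
        print(row)

render()
```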

Nvidia Quadro RTX Chips Offer AI and Real-Time Ray Tracing

Nvidia unveiled its new Turing architecture during a keynote at SIGGRAPH 2018, along with three new Quadro RTX workstation graphics cards aimed at professionals. Nvidia calls Turing its “greatest leap since the invention of the CUDA GPU in 2006.” The RTX chips are the first to use the company’s ray tracing rendering method, which results in more realistic imagery. Also at SIGGRAPH, Porsche showed off car designs created with Epic Games’ Unreal Engine and Nvidia’s RTX chips.

OTOY Rolls Out Blockchain-Based Rendering Platform RNDR

OTOY, a Los Angeles-based visual effects software firm, launched RNDR to allow more people to create 3D computer-generated images. The company, which created software used for productions such as “Westworld” and “The Avengers,” combined cloud, blockchain and cryptocurrency technologies so that 3D imagery can be rendered on shared, cloud-hosted hardware and then sold and/or traded via blockchain. Doing so, says chief executive Jules Urbach, reduces the cost, time and labor of creating such assets.

DeepMind Intros Intriguing Deep Neural Network Algorithm

A research team at Google’s AI unit DeepMind, led by Ali Eslami and Danilo Rezende, has created a generative query network (GQN), deep neural network software that can render a view of a scene the network has never seen. The U.K.-based unit’s software takes a handful of shots of a virtual scene, builds a “compact mathematical representation” of the scene, and then uses that representation to render an image from a new perspective unfamiliar to the network.
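
The GQN recipe can be sketched in a few lines: encode each (image, viewpoint) observation, aggregate the encodings into one scene representation, then condition a generator on that representation plus a query viewpoint. The Python below uses random linear layers as stand-ins for DeepMind’s trained networks, so it illustrates the structure only, not the published model.

```python
# Structural sketch of the GQN idea with random stand-in "networks".
# Not DeepMind's trained architecture; shapes and layers are hypothetical.
import numpy as np

rng = np.random.default_rng(0)
IMG_DIM, VIEW_DIM, REPR_DIM = 64, 7, 32  # toy sizes; viewpoint = position + orientation

W_repr = rng.normal(size=(IMG_DIM + VIEW_DIM, REPR_DIM))  # stand-in representation net
W_gen = rng.normal(size=(REPR_DIM + VIEW_DIM, IMG_DIM))   # stand-in generation net

def encode_observation(image, viewpoint):
    """Map one (image, viewpoint) pair to a per-view encoding."""
    return np.tanh(np.concatenate([image, viewpoint]) @ W_repr)

def scene_representation(observations):
    """Sum per-view encodings into a single, order-invariant scene representation."""
    return sum(encode_observation(img, vp) for img, vp in observations)

def render_query(representation, query_viewpoint):
    """Predict an image for a viewpoint the network has never observed."""
    return np.tanh(np.concatenate([representation, query_viewpoint]) @ W_gen)

# A handful of observed shots of a (fake) scene, plus one unseen query view.
observations = [(rng.normal(size=IMG_DIM), rng.normal(size=VIEW_DIM)) for _ in range(3)]
r = scene_representation(observations)
predicted = render_query(r, rng.normal(size=VIEW_DIM))
print(predicted.shape)  # (64,) -- a flattened toy "image"
```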

The Best New Products Displayed at Augmented World Expo

Several demos stood out at the 9th annual Augmented World Expo in Santa Clara, California last week. The most compelling involved a holographic display from Brooklyn-based Looking Glass Factory. Co-founder and CEO Shawn Frayne and his team have been working for a few years on a technique that “blends the best of volumetric rendering and light field projection.” Also compelling was a markerless multi-person tracking system that runs off a single video feed, developed by a Canadian computer vision/deep learning company named wrnch. And marking its first exhibit in the United States since launching its latest satellite office in San Francisco this April, Japanese company Miraisens demonstrated how a suite of haptic effects could be used to enhance extended reality experiences.

HPA 2018: Update on Tools, Production and Post in the Cloud

For this year’s Super Bowl, The Mill in London produced 25 commercials, relying heavily on the cloud. “There’s no way we could have gotten that done without a burst of rendering in the cloud,” said The Mill group technical director Roy Trosh. “When we know we have a vendor bulge, we used to bring a [server] supplier and it took three days to get ready to render. This time it took 15 minutes.” At this week’s HPA Tech Retreat, manufacturers and users described how the industry has evolved with regard to cloud production and post.

Unity, Filmmaker Neill Blomkamp Partner on Creative Project

Game engine maker Unity and filmmaker Neill Blomkamp (“Elysium,” “District 9”) have partnered to create a more fleshed-out version of “Adam,” a short proof-of-concept film that Unity released to show off its Cinemachine movie-creation tools. At the same time, Unity debuted a new version of Cinemachine, software that allows users to direct their own CG films. Blomkamp is an ideal partner, having just launched his own studio, Oats Studios, and released three videos (“Rakka,” “Firebase,” “Zygote”) as well as other shorter projects.

AWS Debuts Amazon Sumerian to Build VR, AR and 3D Apps

Amazon Web Services announced a new service called Amazon Sumerian during the kick-off event for its AWS re:Invent conference in Las Vegas. The service lets developers quickly and easily build virtual reality, augmented reality and 3D applications with minimal coding for a range of platforms, including browsers, head-mounted displays, mobile devices and digital signs. Initially, Sumerian-built apps will run on any browser that supports WebGL or WebVR rendering, including on Google’s Daydream, HTC Vive, Oculus Rift and iOS devices.

OTOY Uses Blockchain Tech for Distributed Cloud Rendering

The technology underlying Bitcoin is now being applied to rendering 3D visual effects. Los Angeles-based OTOY, which provides a GPU-based software system for creating a cloud-based 3D content pipeline, hopes to raise as much as $134 million to develop RNDR, a blockchain-based distributed cloud rendering platform for VR and other content. HBO and Discovery have invested in OTOY, which has also partnered with Facebook and Mattel. Relying on cloud-based GPUs for rendering is a much less expensive solution than using supercomputers.

AMD Pitches Latency-Free Virtual Reality via Super-Fast Wi-Fi

Advanced Micro Devices (AMD) has acquired Nitero, a startup responsible for a 60-gigahertz wireless chip that transmits high-res video without latency. AMD, which bought the company for an undisclosed price, believes Nitero’s chip will help it sell more wireless virtual reality headsets. Sales of VR headsets, according to AMD executive Roy Taylor, have been limited by their need to be tethered to a computer. Nitero was originally a spinoff from a research center sponsored by the Australian government.

Epic Games Demos Real-Time Effects for New Branded Short

In “The Human Race,” a short produced by visual effects house The Mill for Chevrolet, the automaker’s new 2017 Camaro ZL1 races a futuristic Chevy concept car, one driven by a racecar driver and the other by artificial intelligence. The short premiered at the Game Developers Conference to showcase how the car was created via real-time rendering with the help of the Unreal game engine. Kim Libreri, CTO of Unreal maker Epic Games, demonstrated how aspects of the movie could be changed in real time while it was playing.

Scandy Introduces SDK for 3D Scanning via Android Devices

Scandy, a company with technology for printing 3D images on demand, is debuting a beta version of a $500 tool that scans objects in 3D from Android devices. The company relies on 3D sensors from chip tech provider pmd to achieve 0.3mm feature precision, a degree of resolution ordinarily found only in much more expensive toolsets. The company is also making its Scandy Core software development kit available to developers, with the idea that they will create innovative 3D scanning products and services. The beta program is open now.

Researchers Develop Efficient Way to Render Shiny Surfaces

Computer scientists at UC San Diego have developed an efficient technique for rendering the sparkling, shiny and uneven surfaces of water, various metals and materials such as injection-molded plastic finishes. The team created an algorithm that improves how CG software reproduces the bright flecks of reflected light, known as “glints,” that appear on such surfaces; the team claims the technique is 100 times faster than current state-of-the-art methods, requires minimal computational resources, and works for animation as well as still images.
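
To see why glints are costly to render, note that a single pixel’s footprint can cover an enormous number of microscopic facets, and only the few whose normals mirror the light toward the camera contribute a spark. The brute-force Python sampler below conveys that idea; it is not the UC San Diego team’s accelerated algorithm, which avoids this kind of exhaustive sampling.

```python
# Brute-force illustration of glints: count microfacet normals within a
# pixel's footprint that reflect the light toward the viewer. Illustrative
# only; the UCSD method achieves the same effect far more efficiently.
import math
import random

def normalize(v):
    n = math.sqrt(sum(c * c for c in v))
    return tuple(c / n for c in v)

def reflect(incident, normal):
    d = sum(i * n for i, n in zip(incident, normal))
    return tuple(i - 2.0 * d * n for i, n in zip(incident, normal))

def glint_intensity(light_dir, view_dir, roughness=0.15, samples=20000, tol=0.02):
    """Fraction of sampled microfacet normals that bounce the light toward the viewer."""
    light_dir, view_dir = normalize(light_dir), normalize(view_dir)
    incoming = tuple(-c for c in light_dir)  # light travels toward the surface
    hits = 0
    for _ in range(samples):
        # Perturb the surface normal to mimic a rough, sparkly micro-surface.
        normal = normalize((random.gauss(0.0, roughness),
                            random.gauss(0.0, roughness),
                            1.0))
        bounced = reflect(incoming, normal)
        alignment = sum(b * v for b, v in zip(bounced, view_dir))
        if alignment > 1.0 - tol:
            hits += 1
    return hits / samples

print(glint_intensity(light_dir=(0.3, 0.4, 1.0), view_dir=(0.0, 0.0, 1.0)))
```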