NVIDIA Announces Omniverse Replicator Synthetic AI Data Generation Engine
Powerful system produces physically simulated data for training neural networks; first implementations shown are NVIDIA DRIVE Sim and NVIDIA Isaac Sim apps.
In his virtual GTC keynote, company founder and CEO Jensen Huang introduced a host of new technologies, including his miniature 'Toy Jensen' avatar, demonstrating transformative uses of the tech giant's latest offerings in accelerated computing and the Omniverse Enterprise virtual world simulation and collaboration platform.
Coming Wednesday, November 10, the virtual event will show how to create stylized character-based animation using UE and Autodesk Maya, exploring open standards like Alembic and USD to quickly move work between the two platforms.
This free presentation on November 17 will showcase how companies can address remote and hybrid work challenges with NVIDIA’s new real-time collaboration and simulation platform, Omniverse Enterprise; PNY is also hosting a series of Omniverse Enterprise sessions at GTC 2021, NVIDIA’s free digital conference running November 8-11.
The tech giant’s free virtual conference runs November 8-11, featuring 500+ sessions with industry leaders; highlights include founder and CEO Jensen Huang’s keynote, a discussion on the potential of USD for 3D creators, and how the Omniverse, USD, and real-time rendering are transforming global film production and workflows.
Available now for download and free use in UE 4.27 and UE5 Early Access, the nimble 3DCG warrior stars in the studio's upcoming content sample project, 'Slay,' which will be downloadable later this month.
Produced using Unreal Engine and innovative virtual production tools, the CG film is set in the same dystopian world as the 'Zombie Army' video game franchise; the use of game assets allowed fast iteration and a compressed production timeframe.
Aardman and Fictioneers’ new city-scale augmented reality experience transforms Bristol and Cardiff in the UK, and San Francisco in the US, into virtual stages; for a limited time, the app is free on iOS and Android devices in the UK.
Now available, the latest update to Epic Games' real-time game engine boasts improvements for creators across multiple industries, including game development, film and television, and architecture; enhanced features include improved VFX workflows, faster light baking, integrated RAD Game Tools, and production-ready Pixel Streaming.
Update allows users to explore V-Ray scenes in real time, render high-quality lightmaps for light baking, and produce photorealistic, ray-traced images and animation.
New design fits RTX technology into more desktops, bringing AI acceleration and real-time ray tracing to more professionals and their design workflows.
Groundbreaking simulation and collaboration platform introduces Blender and Adobe integrations that will open up the system to millions of additional users.
Newly launched NEP Virtual Studios acquires leading real-time production companies to expand its growing virtual production business; former Digital Domain CEO Cliff Plumer named group leader.
New enhanced simulation platform with robotics-specific extensions provides essential features for building virtual robotic worlds and experiments.
Leading visualization company and its ARENA virtual camera system helped the filmmaker and his VFX team plan, visualize, shoot, and deliver key astronaut spacewalk and airlock death sequences on the Oscar-nominated, VES Award-winning sci-fi drama.
Latest deep dive under the hood of Unreal Engine details challenges and real-time solutions for creating realistic-looking hair and fur.
Digitized shots of Icelandic snow, glaciers, and misty weather, coupled with real-time camera tracking and Unreal Engine, meant high-resolution environment projections and lighting could be captured in-camera, providing unprecedented realism for onstage shooting in George Clooney and Netflix's Oscar-nominated sci-fi drama.
The Sony Pictures Imageworks senior VFX supervisor discusses his studio’s integration of Epic’s game engine technology and real-time workflows into its animation production on a new episode of Netflix’s ‘Love, Death & Robots’ anthology series.
Transmedia startup Voltaku, fueled by an Epic MegaGrant, begins development on animated series about two siblings’ adventures in a futuristic megacity, directed by Oscar-nominee Ruairi Robinson and penned by ‘Spine of Night’ director Philip Gelatt.
Haz Dulull and He Sun deliver an exciting game cinematic spot for Rebellion’s latest release, produced with a USD-based Houdini pipeline, Unreal Engine, and Redshift.
In a presentation set for March 27, the leading visualization studio will share a more complete view of how virtual production and cutting-edge real-time workflows are being used to produce films and episodic series.
Brian Selznick to pen script for first-ever animated version of F. Scott Fitzgerald's 1925 classic American novel.
For Chinese director Yibing Jiang, Unity’s real-time tools enabled a small global team of artists to work quickly and efficiently, in parallel rather than sequentially, to produce a beautiful and touching film about the bond between a father and his ailing daughter.
Animated feature, based on the director’s award-winning short, will blend 2D and 3D cinema, animation, and video game development techniques; production begins Spring 2021 with the premiere planned for 2023.
Studio's first interactive episode, part of the festival's New Frontier Gallery, uses cloud streaming technology to enable seamless interaction and high-end graphics on mobile platforms.