
December 17th, 1903 was a day that changed the world forever. On that day, a pair of seemingly ordinary bicycle makers from Dayton, Ohio, launched what looked like a propeller-driven box-kite along the Atlantic Ocean beach of Kitty Hawk, North Carolina — achieving history’s first powered flight.


Fast-forward 112 years. Most people today are familiar with the Wright Brothers — Wilbur and Orville Wright — but very little else is known about them. In fact, there is probably a better chance that somebody could tell you who your favorite celebrity is dating at the moment than the name of the Wrights’ first powered airplane, which is now on permanent display at the Smithsonian’s National Air and Space Museum in Washington, DC. Significantly, it is the only aircraft in the entire Smithsonian museum with a separate and dedicated exhibit hall all to itself.


Why would such a famous plane — an aircraft that achieved the world’s first powered flight, and occupies such exalted exhibition space at the most visited museum in the world — be a mere afterthought among today’s public?


“I think people just don’t really understand what an achievement it was that Wilbur and Orville did on that cold December day in 1903,” says investment banker-turned-film producer Jonah Hirsch, sitting in his office in Beverly Hills.


Hirsch, who has spent the better part of the last five years attempting to secure funding for a feature film about the epic story of the Wright Brothers, laments, “Studios want superheroes, they want war, they want vampires — nobody seems to be interested in human achievement anymore except the indie financiers. To do this story justice, the film can’t be done on a typical indie budget, so the options aren’t very good.”

Not one to give up easily, Hirsch set his sights on making an IMAX movie for museum audiences.


“I grew up around the Washington, DC area as a kid, and every year we would visit the Air and Space Museum on school field trips,” says Hirsch. “Without question the best part of the visit to the museum was buying the freeze-dried astronaut Neapolitan ice cream from the gift shop, and then settling in to watch the IMAX movie ‘To Fly’.”


Hirsch was visiting the museum on a trip to discuss the feature film with the Smithsonian’s top Wright Brothers historians, Dr. Tom Crouch and Dr. Peter Jakab. To his surprise, the same IMAX film he saw as a kid was still playing at the Smithsonian. One of the original IMAX movies made for the museum’s giant screen, “To Fly” has been in continual exhibition at the museum for more than 30 years. Hirsch and his partners bought tickets and watched the iconic 40-minute movie. Afterward, Hirsch turned to his partners and said, “We’re making an IMAX movie.”


Hirsch had never made an IMAX movie — but he didn’t view that as a problem. Hirsch and his partners hired well-known Hollywood creative talent to develop an original script. They also hired acclaimed street artist Shepard Fairey to create original artwork for the project in an attempt to make it “feel a bit hipper” for today’s audiences and, hopefully, more attractive to financiers.


The project started taking shape in October of 2014. James Knight, one of Hollywood’s leading visual effects and motion capture experts, who helped pioneer never-before-seen motion capture techniques with James Cameron on the epic blockbuster AVATAR, joined the effort and started scoping out the project. Knight, the youngest member of the prestigious Academy of Motion Picture Arts and Sciences’ Sci-Tech Committee, was looking for something different — different from the typical 16-hour-day grind required by the blockbuster hits he’s worked on, including The Amazing Spider-Man and Hulk.


“When I first heard about the Wright Brothers project, I was immediately interested,” says Knight. “If I wasn’t working on films, I’d be teaching history at university — and this was a perfect opportunity to blend both.” A die-hard British soccer fan (football, more properly), Knight, with his distinctive Burberry kit and High Street accent, was a regular at weekend matches. There he met another die-hard soccer fan and expat living in Los Angeles — Roy Taylor, an executive with computer processor and graphics maker AMD.


Taylor frequently scours the Los Angeles scene for talented virtual reality developers and unique ideas. He was immediately interested in the Wright Brothers project, and suggested something even better than an IMAX film: a “virtual reality” experience.


“IMAX films give the viewer an ‘immersive’ entertainment experience, but virtual reality — VR — enables a feeling of realistic ‘presence’, of actually being inside the experience,” says Knight. After a technical discussion while lunching at a local Hollywood hotspot, Taylor arranged to have an Oculus DK2 VR headset and a very high-powered notebook PC sent to Hirsch so he could experience firsthand what this virtual reality buzz was all about.


Knight set up the VR demo in the conference room, and Hirsch donned the VR headset. It only took about 30 seconds. Hirsch pulled off the headset and said, “OK — we’re doing virtual reality now.”

Working with Crouch and Jakab at the Smithsonian, Hirsch and his team have attempted to re-create the most accurate two minutes of history ever viewed in a virtual experience — right down to the minute details of the aircraft, including which side of the battery the ground cable is connected to — with every detail thoroughly researched and reproduced.


“Never before in the history of cinema has six months of production been devoted to creating a two-minute piece,” says Hirsch. “Even Spielberg takes small liberties on his historical films for the sake of a great story — but we could not, as we had one mission, and one mission only: to recreate history as it was.”


Viewers of the piece will be able to walk around, pause, rewind, and see the historic flight from different vantage points on the sand dunes of Kitty Hawk, where it all began.


If only Wilbur and Orville Wright were alive to see this virtual reality re-creation of their first flight. They might conclude that the future has no need of an airplane, because virtual reality enables everyone to visit with each other in virtual space (though making that a reality is still years away). Hirsch and his team are busy working on their next historical piece, and plan to create a series of historical events that are best suited for the VR experience.


When asked whether he still plans to pursue the IMAX or feature film, Hirsch answers, “Absolutely,” but then pauses and smiles. “The ability to recreate history in a virtual world is something that has never been done before, and is much more exciting to me. What better story to recreate and experience first than the Wright Brothers’ historic flight?”


“First: The Story of Wilbur and Orville” will be available for viewing at E3 for people attending AMD’s Fiji card launch event on June 16, 2015 at the Belasco Theatre in downtown Los Angeles.


PS: The name of the Wright Brothers’ plane — the name nobody knows — was simply the “1903 Flyer.”


Sasa Marinkovic is Head of VR Marketing for AMD. His postings are his own opinions and may not represent AMD’s positions, strategies or opinions. Links to third party sites are provided for convenience and unless explicitly stated, AMD is not responsible for the contents of such linked sites and no endorsement is implied.

The game creators at Crytek are avid storytellers who are practicing their craft as pioneers in VR gaming. They have always been driven by the desire to produce the most advanced visual gaming experiences, regardless of the format, which can be witnessed in the versatility of their industry-renowned game development engine, CRYENGINE. They saw the massive opportunity that VR presents early on, and set out to build a framework for exploring not only the technical challenges of VR, but also the creative ones. “We see the potential of a new medium and immediately start imagining stories that we can tell with it,” explains CRYENGINE Creative Director Frank Vitz. “That leads us to the development of software methods, animation tools and rendering modes that feed back into the storytelling, and, the next thing you know, we have what already feels like a full game. It’s ‘just a tech demo’ but it feels like a glimpse into a whole new world. That is what inspires us. When we see people’s reactions to the demo, their thirst for more, well, we know that we are on to something big.”


Crytek’s first VR gaming foray is a mammoth undertaking – to recreate a prehistoric Earth, replete with dinosaurs and Jurassic vegetation and terrains, and fully immerse players within this bygone world. “People are always looking for something new, an experience that is outside the scope of what is possible in their everyday life,” says Crytek Executive Producer Elijah Dorman Freeman, “Dinosaur Island was an obvious choice for us as a stage for VR; who hasn’t imagined what it would be like to live in a world populated by dinosaurs? This is a meme that has been resurfacing periodically since the first scientific discovery of dinosaur bones almost 200 years ago.  And that is just one example – we will see space stories, historical recreations, underwater adventures; VR offers us a portal into worlds of infinite possibilities, worlds just waiting to be imagined and explored....”


It is obvious that Crytek is serious about the potential of VR entertainment. The company has two teams working on VR innovations: One is a dedicated demo team that is organized like a game production studio. They are exploring the experience space of virtual reality, solving navigation problems, inventing new ways of interacting with virtual worlds, figuring out how to tell a story and make it fun. Freeman is the Executive Producer on this team.


The other group is the CRYENGINE development team, led by Frank Vitz. They are responsible for the integration of VR technology into the engine itself. The Oculus SDK, for example, is one software interface that Crytek has integrated, which gives their developers using CRYENGINE direct access to the Rift Head Mounted Display and various rendering parameters.

“What both teams learn and develop is being integrated into CRYENGINE to be shared with our licensees,” notes Vitz.


A lot of the traditional rules and techniques of game development just don’t apply to the new space of VR, so Crytek’s teams must find novel and inventive equivalents as they navigate the Wild West-like atmosphere of the VR landscape.


For example, the traditional rules of cinematic composition don’t apply when the viewer is free to look in any direction. This changes the role of the director: he or she has to focus on the situation and the 3D arrangement of the scene, and invent new ways to attract the player’s attention.


The most obvious and compelling aspect of a good VR experience is the sense of “presence”, the feeling that you are in a real place, not looking at a projection on a screen or a monitor. As Vitz expounds, “We believe that a really great VR experience will make the player feel that they are not just observers, but also part of that world, that they can interact with it, that the things they do may change the course of events in the world.” “And, most compelling,” Freeman continues, “is that the world is aware of them; creatures react, characters look at them and what they do matters. Of course this is true of traditional video games as well, but the immersion of VR makes it all the more compelling.”


Delivering on this promise of a totally immersive and completely compelling VR experience is a tall order, but Crytek has already received an overwhelming response from developers who have seen what CRYENGINE can do for VR and want it for their projects. Dario Sancho and Valerio Guagliumi are the two main rendering software engineers who have been leading the charge for a VR-optimized CRYENGINE.  Sancho enthuses, “Our experience in stereoscopic 3D, combined with the engine's rendering power and ability to deliver high resolution images at a high frame rate, means CRYENGINE offers a degree of visual fidelity that many people feel is foundational to a compelling VR experience. We can't wait to see what other CRYENGINE users create now that they have VR capabilities at their fingertips.”


Freeman expands on Sancho’s enthusiasm and describes how Crytek’s early VR adventurers are at the forefront of boundless VR discoveries: “The growth in the VR industry in the past two years has been explosive, and this is just the first glimpse of the possibilities. There is a sense in the VR community that we are explorers, going out in all directions, building new ships, and bringing back tales of distant lands. And we see this period of expansion continuing exponentially. It’s going to change the whole nature of games and interactive entertainment. This has us really excited, because Crytek is right at the epicenter.”


Hardware manufacturers have a huge role in accelerating the technological advancement necessary for VR and enabling the ‘presence’ that VR developers require. After all, the ability to present compelling worlds is fundamentally driven by the power of the modern GPU and its ability to render high fidelity imagery at high frame rates with low latency. AMD understands the challenges and they have taken a bold direction with their LiquidVR initiative.  Valerio Guagliumi confirms, “We are integrating LiquidVR directly into CRYENGINE, which will allow us to take advantage of its innovations in low latency head tracking, scalable multi-GPU rendering, VR device compatibility and support for exotic peripherals.”


As Crytek crafts the future’s virtual worlds, Vitz concludes by describing the eagerness of his CRYENGINE team to collaborate with AMD on bringing to fruition the limitless possibilities of VR: “LiquidVR is going to solve many of the problems of ‘presence’ allowing us to focus on delivering the power of CRYENGINE to build worlds and tell stories.”



We understand less about the human brain than about any other organ of the body. However, this may soon change. GE scientists, collaborating with top researchers and institutions around the world, are working to unlock insights about the brain that were previously not possible. New tools and technologies for visualizing brain anatomy and function can give researchers new insights and a better understanding of the treatments needed for brain illness or injury.


Among the best visualization tools available today is virtual reality (VR). VR refers to computer-generated “virtual” environments that simulate your physical presence within a software-created world, delivering immersive and lifelike experiences that seem quite real. Already beginning to revolutionize PC gaming, VR is poised to dramatically transform an array of applications and industries, including entertainment, business, education, communications, training, psychotherapy — and medical research.


“Visualizing the human brain in intricate detail in virtual reality is no easy feat,” notes Katrina Craigwell, Director of Global Content and Programming at GE. “It takes an incredible amount of computational horsepower to create the virtual world and ensure that you’re completely comfortable interacting within it. To power the experience, we’re using AMD’s next-generation GPUs, which not only deliver the exceptional graphical detail we want, but also a number of optimizations via their LiquidVR™ technology that ensure the virtual world feels every bit as responsive and natural as the real world.”


Combining the latest VR technologies with powerful AMD computing and graphics capabilities, GE scientists have created a virtual portal into the human brain, enabling anybody to enter, view, and explore the brain in ways never before possible. The Neuro VR Experience enables an interactive understanding of how GE scientists are breaking new ground in understanding how the brain works.


In GE’s Neuro VR Experience, viewers are virtually transported into Reuben Wu’s brain and are introduced to the inner workings of his mind – the interplay of billions of synaptic responses to sense and thought stimuli. This software-generated representation of Reuben’s mind simulates a physical presence within the universe of thoughts, desires, hopes and dreams contained within his brain.


Reuben is no stranger to the surreal. As an industrial designer, photographer, DJ, keyboardist and songwriter for the electro-synthpop band, Ladytron, the Liverpudlian artist captures and creates mind-bending auditory and visual experiences. His work not only stimulates the senses and elicits emotions, but also fuels new understanding of the grandeur of our natural environment and the imagining of what is possible. He has a beautiful mind, one that all will soon be able to explore through the latest in VR experiences.


As participants marvel at the intricate tapestry of intelligence, behavioral, sense and motion control centers that compose the masterpiece of the mind, they will also learn about the pivotal GE innovations that are propelling research into, and understanding of, the human brain. Enhanced visualization techniques, such as higher-resolution and mobile magnetic resonance imaging (MRI) scanners, and integrated mechanisms to better analyze the volumes of data collected from brain research, are expediting the pursuit of cures for neurological diseases and disorders.


Offering a tantalizing virtual glimpse into the medical imaging and visualization advancements that will soon be reality, GE’s Neuro VR Experience immerses participants in a unique and visually stunning interactive journey through the wonder that is the human brain – the source of all qualities that define our humanity.







Three billion personal computers in use today consume more than 1% of global energy production, and 30 million computer servers use an added 1.5% of global electricity generation. And it's not just computers using all of this power. The explosion of smartphones, tablets and other digitally enabled devices - the so-called "Internet of Things" - is causing all of those numbers to escalate. By 2020, the estimate is that there will be 50 billion connected devices - about seven devices for every person on the planet today - forecast to consume 14% of global electricity generation.

So, with this large and growing power demand, is the digital revolution helping or hurting efforts to reduce greenhouse gasses? A panel discussion at the recent Boston College Corporate Citizenship Conference dug into this question. The panel featured Sam Naffziger, Corporate and IEEE Fellow from AMD; Dr. Neal Elliott at ACEEE; Chris Lloyd of Verizon; and Dr. Michael Webber from the University of Texas. Titled "The Future Is Energy Efficiency: How the Digital Revolution Affects Sustainability," this discussion explored the sustainability implications of the technology revolution and the trend lines that will impact the future.

Obviously, conserving energy is an important issue for the technology industry. By saving energy, we can help reduce costs, preserve natural resources and mitigate the climate impacts associated with energy production and use. Case in point: About a year ago AMD announced a goal to improve the energy efficiency of its mobile processors by 25 times by the year 2020 from a 2014 baseline. It's an ambitious goal but one worth aiming for. Using a car analogy, this rate of improvement would be like turning a 100-horsepower car that gets 30 miles per gallon into a 500-horsepower car that gets 150 miles per gallon in only six years.
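The car analogy can be checked with a little arithmetic: if "efficiency" is taken as work delivered per unit of fuel (horsepower times miles per gallon, as a rough stand-in for compute performance per watt), the figures in the analogy multiply out to exactly 25x. A minimal sketch — the efficiency metric here is an illustrative assumption, not AMD's actual measurement methodology:

```python
def efficiency(horsepower, mpg):
    """Illustrative 'work per gallon' metric: power times distance per unit fuel.
    (An assumed stand-in for performance-per-watt; not AMD's official metric.)"""
    return horsepower * mpg

baseline = efficiency(100, 30)   # the 2014-style car: 100 hp at 30 mpg
goal = efficiency(500, 150)      # the 2020-style car: 500 hp at 150 mpg

# 5x the power times 5x the fuel economy compounds to a 25x improvement:
print(goal / baseline)  # 25.0
```

The point of the analogy is that the gains compound: performance and frugality each improve fivefold, and the efficiency goal is their product.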

That's pretty significant. Put another way, if all laptops in use in 2020 matched AMD's energy efficiency goal, the annual energy savings could amount to 18.4 billion kilowatt hours. That's equivalent to the output from 3.3 coal-fired power plants, which is enough to supply 150% of the annual power needs for Washington D.C. And that's just for laptops. If similar efficiencies extended throughout the information and communications technology (ICT) industries, the savings would be compounded.

But there is an even bigger story: As we move into the era of the "Internet of Things," digitally enabled devices are being utilized in a myriad of ways that can save energy and benefit society. From new medical devices to distance learning technologies to connected thermostats, digitally enabled devices are helping to make our world smarter and more efficient. Just one example from the GeSI SMARTER 2020 study: Digitally enabled systems could cut greenhouse gas emissions 16.5% by 2020, resulting in $1.9 trillion savings in energy costs.

The ACEEE uses the term "intelligent efficiency" to characterize the savings that result from applying ICT to energy-using systems, and their research shows that these savings can exceed 15%. Of course ICT uses some energy to achieve these savings, but the research shows that intelligent efficiency saves between 10 and 20 times the energy it requires.
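The "intelligent efficiency" claim reduces to a simple net-savings calculation: if an ICT system spends some energy and saves 10 to 20 times that amount elsewhere, the net benefit is the gross savings minus the ICT overhead. A rough sketch with hypothetical numbers — the 1,000 kWh controller figure below is invented purely for illustration:

```python
def net_savings(ict_energy_kwh, return_ratio):
    """Energy saved elsewhere, minus the ICT energy spent to achieve it."""
    gross_savings = ict_energy_kwh * return_ratio
    return gross_savings - ict_energy_kwh

# A hypothetical smart-building controller drawing 1,000 kWh/year,
# evaluated at the low and high ends of ACEEE's reported 10-20x range:
print(net_savings(1000, 10))  # 9000 kWh net saved per year
print(net_savings(1000, 20))  # 19000 kWh net saved per year
```

Even at the conservative end of the range, the system pays back its own consumption ninefold, which is why the panel framed ICT as a net positive for emissions despite its growing electricity footprint.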

So, the message is twofold: Energy-efficient technology is essential for our digital future and these technologies can enable energy savings across the entire economy. It's a win-win situation: As more systems are enabled with energy-efficient digital technology, customers save money and we lighten our environmental footprint. As someone who has worked in environmental protection for more than 30 years, it's a rare and wonderful thing when the needs of an industry and the environment align.

Tim Mohin, Director of Corporate Responsibility, AMD