What do custom-made dresses for Dita Von Teese, military airfield runways, Ebola containment and patient care, and museum displays have in common? Thinking … thinking … At first glance, not much. But all were prominently featured at last week’s REAL 2015, a gathering to cross-pollinate the arts, design and engineering, historical preservation and industry around the themes of capturing, computing and creating reality. When Autodesk first floated the idea of hosting REAL, it sounded like it couldn’t possibly work (what, after all, could an aircraft maker learn from someone who designs urban installations made from hula hoops?), but it did. Stepping away from the daily grind to look up and around at innovative projects created an incredible creative energy that permeated the venue. It’s not clear when Autodesk will host the next REAL event, but try to go. The return on your time investment will be manyfold.
Capturing is probably the best known of the trio. You’ve seen surveyors on a local street or a Google Street View vehicle, and maybe your dentist has scanned your mouth for a crown. You may even have snapped away with your cellphone camera, so you know about capturing the reality that surrounds you. What you probably don’t know is how rapidly that technology is evolving and growing to encompass ever more types of sensors. REAL speakers showed off laser scanners big and small, photogrammetry, thermal, chemical, CAT/PET medical imaging and other types of sensors that enabled them to explore and “map” their projects.
One of the most creative and thought-provoking presenters was Prof. Sarah Kenderdine of the University of New South Wales, who is dedicated to making the world’s historical artifacts available to more people than can make it to a museum — and to making more of the museum available to the public. Did you know that many museums can show only 2% of their collections? That means 98% of their artifacts are hidden away, not because they’re unimportant, but simply because there’s no space, or the space is inappropriate to safeguard the treasures. Prof. Kenderdine has developed installations, often portable, that let the public explore distant caves or huge collections from within a space, using scanned images combined with soundscapes and interactive media to make history come alive. That’s great, you say, but what’s the commercial application? How can my widgets/cars/electronics/buildings/bridges benefit from something like this? By giving the consumer of your creation a glimpse into the manufacturing process and into how the product will look and feel before they actually have the object. Sit in a driver’s seat, test-fly a plane, feel (as much as is digitally possible) how the object will sit in your hand, explore a new home before it’s built — foster a better, tighter connection with your customers before committing to actual manufacturing, before real money is spent to bring the product to physical life.
Drones were a huge part of the capture phenomenon at REAL 2015, especially as the FAA just released new operational guidelines. Nothing is clear-cut yet, so check with an aviation attorney. Speakers had drones doing everything from automatically checking the state of a construction project at the end of the work day as prep for the next morning’s status meeting (it’s next to impossible to argue about whether something is complete when there’s visual confirmation and a model to walk through) to periodic flyovers that capture erosion data for environmental modeling. AEC industry attendees were looking for new business areas, and many found them in new applications for drone-based sensing. As sensors continue to get smaller, lighter and less power-hungry (and as regulations around their use become clearer), many of the measuring tasks carried out today by humans will shift over to drones.
Creating, because of the hype around 3D printing/rapid prototyping/additive manufacturing, is also very much in the news these days. Ms. Von Teese’s dress, the Ebola isolation cubes, art installations — all are examples of making with new technology. But not all creating has to be physical, or based on emerging 3D technologies. One of the best sessions came on the first day, when Autodesk CEO Carl Bass (above) and artists Bill Kreysler and Bruce Beasley spoke about their creative processes, about movement and line, and about how today’s technologies let them model digitally and in prototype form before committing materials and time to the full, final product. I’m no artist; I tend to color inside the lines and am freaked out by a blank piece of art paper. But I do cook, which Mr. Bass said fits a broader definition of “maker”. He noted that 30 million people visit Autodesk’s Instructables website each month, adding and referencing ibles (what the cool kids call the projects listed there) of all sorts, from cooking to crafting to electronics and 3D printing. The point: “We need to be a little less proud about letting computers help us design”, he said, since the future of making is a “weird mix of analog and digital.”
The compute part, the middle bit of capture/compute/create, is the least understood but likely the most powerful driver of change in how we make. We can take laser scans and photos by the billions, but unless we can stitch them together and use them for some downstream purpose, they’re cool but kind of pointless. We run the risk of capturing gigabytes of the wrong data without an end-game in mind. REAL demonstrated the current state of reality computing, producing very realistic 3D-feeling scans, for example, and also raised real questions about how far we can take this technology. Lance Filler, Airfield Damage Repair Program Manager for the US Air Force, is charged with getting runways back into operation at military installations after an attack. Before planes can safely land or take off, the runways need to be repaired — but before that, a team must clear unexploded ordnance and assess the damage to the runway. And before that, Mr. Filler’s team has just 30 minutes to count and map thousands of bombs, identify craters and nearly invisible penetrations into the subsurface under the pavement, and provide assessments for the removal and repair teams. This is nowhere near my normal wheelhouse (or most of the audience’s, I’d wager), but you could see the wheels turning: laser, sonar, thermal — what types of sensors? Drones, vehicle-based, line-of-sight from a platform? How do you turn the raw data into something that knows this is a normal crack in the pavement but that is an unexploded bomb? How do you map the data — GIS? Oh, and the installation can’t itself become a target, so it must be mobile and quick to install. It has to be cheap, too. There are no easy answers, but once we get to the heart of these types of computing challenges, the worlds of digital and physical really will blur.
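One small piece of that challenge, spotting where a surface has changed between a baseline scan and a fresh capture, can be sketched as simple point-cloud differencing. To be clear, this is a toy illustration of the general idea, not anything the Air Force program actually uses; the function name, the toy data and the distance threshold are all my own assumptions, and a real pipeline would use spatial indices and far richer classification than a single cutoff.

```python
import numpy as np

def flag_anomalies(baseline, survey, threshold=0.5):
    """Flag survey points farther than `threshold` from every baseline point.

    baseline, survey: (N, 3) arrays of x, y, z coordinates.
    Brute-force nearest neighbour -- fine for a sketch, hopeless for
    billions of points (real pipelines use k-d trees or octrees).
    """
    # Pairwise distances, shape (len(survey), len(baseline))
    d = np.linalg.norm(survey[:, None, :] - baseline[None, :, :], axis=2)
    return d.min(axis=1) > threshold

# Toy data: a flat "runway" patch, then the same patch plus one deep depression.
baseline = np.array([[x, y, 0.0] for x in range(5) for y in range(5)])
survey = np.vstack([baseline, [[2.5, 2.5, -2.0]]])

anomalies = survey[flag_anomalies(baseline, survey)]
print(anomalies)  # only the depression point is flagged
```

The same differencing idea shows up elsewhere in the REAL talks, for instance in comparing coral-reef scans over time.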
The projects showcased at REAL covered the range from the expected, such as using laser scans to check for panel alignment in concert halls, to the unexpected, like scanning coral reefs to do differencing over time. My key takeaways:
- No matter how cool the toy, you have to let the concept lead. Technology must be in service of the project, not implemented for its own sake. This is true in reality capture, in the Internet of Things, in gathering data for predictive maintenance … You can quickly become overwhelmed by all that you can do; stick to what creates real benefit. But
- If you’re too set in your ways, too closed to new ideas, be ready to lose to agile innovators. Business benefit is in the eyes of your customer, and one of these insanely creative entrepreneurs can find a maker space, borrow a garage, or leverage new technologies in a way that simply out-innovates traditional companies. Lumio was apparently born because its inventor was told that lighting wasn’t meant to be beautiful or portable in this way. No factory? No problem. But
- No one innovates alone. Reality computing seems to work best when engineers look at nature and art; when artists learn materials science; when museum curators become virtual-reality specialists, and so on. Because it’s a visual medium, it makes collaboration quick and easy and fosters conversations across disciplines and industries that might never have happened otherwise.
Autodesk didn’t focus on its products at all during the event (no sales pitches, except from the co-sponsors), but we did learn that Memento is now in public beta and available for download by Windows users; the Mac version will be out in a couple of months. Memento leverages cloud computing to create high-quality meshes from photo or laser-scan captures, readying the data for use in additive or subtractive manufacturing, virtual reality and other workflows. “Memento” literally means “keepsake” or “souvenir”, but that’s not the only use for the product: designers can use it to create a starting point for further refinement, since the mesh can be exported to many of the 150 or so products Autodesk markets, then updated in CAD or simulated with FEA and CFD tools.
Whether or not you think this reality computing thing has legs, it’s undeniable that low-cost tools and compute horsepower, plus some DIY investment of time and money, aided and abetted by the huge amount of expertise available via the Internet, let many more people participate in connected exploration. We can work together to map coral reefs, photograph important cultural relics before they’re destroyed or lost, and design public spaces that meet the needs of the community. That kind of cross-pollination is bound to lead to cool new ideas and business opportunities.
Note: Autodesk graciously covered some of the expenses associated with my participation at the event but did not in any way influence the content of this post.