The technology used to conceive, design and fabricate the objects around us is complicated. It may be difficult to understand if you're not a practitioner, yet businesses routinely entrust their most important processes to these tools. Our Hot Topics blog tries to clear up some of the confusion.
If you have comments or ideas for Hot Topics, please contact us.
ESI announced today that it is acquiring CIVITEC, maker of Pro-SiVIC, a platform for the modeling and simulation of sensors in what it calls “perception assistance systems”. You’ve seen the commercial for the car that parallel parks itself: how does it know where it sits in relation to the parking space? How far away other vehicles are, fore and aft? How hard to turn the steering wheel? All of that requires data, and then systems design: creating a 3D model of your car and surrounding cars/trees/etc, capturing your changing location with sensors, and connecting the computed desired action to electronics to mechanical systems that make it all happen. Pro-SiVIC enables designers to create these interactions, define and configure the sensors and produce simulated data to store or exchange with another application, such as the mechatronics that enable the wheels to turn.
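The sense-compute-actuate chain described above can be sketched in a few lines of code. This toy loop is purely illustrative (the sensor model, function names and threshold are invented, and none of it reflects the actual Pro-SiVIC API):

```python
# Toy sense -> decide -> actuate loop, in the spirit of the parallel-parking
# example above. Everything here is invented for illustration; this is not
# the Pro-SiVIC API.

def simulate_range_sensor(car_pos, obstacles, noise=0.0):
    """Simulated sensor: distance from the car to the nearest obstacle (1D)."""
    return min(abs(obs - car_pos) for obs in obstacles) + noise

def parking_controller(distance, stop_threshold=0.5):
    """Map simulated sensor data to a command for the mechatronics layer."""
    return "brake" if distance <= stop_threshold else "creep_backward"

# One step of the loop: simulated sensor data feeds the controller, whose
# output would be handed to the steering and braking systems.
reading = simulate_range_sensor(car_pos=2.0, obstacles=[0.0, 5.0])
command = parking_controller(reading)
```

A real platform models the 3D scene, sensor physics and noise; the point here is only the data flow from simulated sensor to computed action.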
CIVITEC was formed in 2009 as a spin-off of IFSTTAR, the French Institute of Science and Technology for Transport, Development and Networks. By acquiring 80% of CIVITEC’s shares, ESI Group says it has the opportunity to commercialize CIVITEC technology, make the Pro-SiVIC development platform more industrially robust, and deliver it through its worldwide sales network. Details of the transaction were not announced, though the deal was financed by ESI’s credit facilities.
Alain de Rouvray, ESI’s CEO said in prepared remarks that “[t]his new expertise of assistance to human perception … provides opportunity to take into account the interactions of a vehicle, or any other industrial product, with its scalable immersive environment. Once integrated into digital 3D modeling, it will enable dramatically accelerated design and prototyping of embedded control and security systems and thereby strengthen the value of our global solutions in virtual prototyping.”
Further, M. de Rouvray says that the acquisition is “[c]learly strategic for ESI Group [and] amplifies our technological and commercial synergies in many sectors, including automotive, aerospace and more generally address the new societal challenges of the mobility and safety industries”.
If you drive a newish car, you likely have some form of Advanced Driver Assistance Systems (ADAS). The generic category includes everything from a backup camera that beeps if you get too close to an object, to sensors that turn your headlights on at dusk, to adaptive cruise control, automated braking and technology that keeps you in your lane. ADAS regulations are still evolving, but ESI estimates that it can cost on the order of €10 million to €30 million for a vehicle manufacturer to prove that it meets active safety qualifications if it has to build physical prototypes to prove compliance; Pro-SiVIC enables ADAS simulation at a substantially lower cost. And, of course, as we get ever closer to an autonomous, driverless vehicle, the need for this technology will only grow.
ESI says that CIVITEC is operationally profitable (likely meaning that the company reports a net loss that ESI can mitigate), and that the acquisition is expected to be accretive to ESI in the short term.
ESI was the acknowledged leader in crash simulation for many years and remains so in parts of the world and at the high-end; adding Pro-SiVIC to the mix enables it to market a more comprehensive passive and active safety technology design and simulation solution while also growing into embedded electronic systems. We all know that embedded systems are used across many industries, so this acquisition will help ESI strengthen its offering across the board.
Dashing to the airport (from warm and lovely San Diego, to a cold and possibly snowy Boston — what am I thinking?!) and just got an email announcing that Intergraph has acquired OhmTech, makers of Visual Vessel Design, an analytical tool to design pressure vessels, shell and tube exchangers, and boilers. Visual Vessel Design (VVD) has been on the market for 30 years, with a strong presence in Europe and among vessel fabricators, refiners and engineering companies, in both onshore and offshore markets.
OhmTech says that VVD complements PV Elite, Intergraph’s historical solution for vessel and heat exchanger design, analysis and evaluation. The companies see this as an opportunity to grow the adoption of PV Elite and VVD throughout the process, power, and marine industries.
Bjorn-Olav Ohm, founder of OhmTech, will join the Intergraph team in Stavanger, Norway as Technical Director for VVD.
Intergraph PPM CEO Gerhard Sallinger called the addition of Visual Vessel Design to Intergraph’s solution suite “a great opportunity to grow our presence in Europe. This acquisition will further strengthen and reiterate our mission to be a global leader in the process, power, and marine industries.”
I’m looking forward to learning more about how VVD complements PV Elite – it could be as simple as European code compliance or more complex, adding capabilities that PV Elite doesn’t currently possess. Will let you know what I find out.
ESI last week reported results for Q4 and the fiscal year ended January 31, 2015 that were surprisingly strong, with Q4 total revenue up 8% as reported to €49 million, boosting full year revenue to €110 million, up 1.5%. That’s quite a reversal, given that revenue through the first nine months had been down 3% year/year.
CEO Alain de Rouvray said in prepared remarks, “The final quarter of 2014 reveals a positive sales dynamic partially overshadowed by transition effects affecting both Licenses and Services over the first three quarters … The year was also marked by a negative currency effect and a difficult political and economic context in BRIC countries. … [T]he success of our immersive virtual reality offering and the interest in our solutions enabling a response to environmental challenges such as air quality and renewable energy illustrate the solidity of our strategy and its diversification potential.”
M. de Rouvray is referring to the lackluster results earlier in the year, caused by currency effects and postponed deals, as well as a transition away from lower-value services toward those that are likely to lead to future license sales. ESI now wants to help customers change their product development strategies to include more virtual prototyping, which may require new business processes. That’s far more valuable, long-term, to both ESI and its clients, but it does take time.
- License revenue in Q4 was €40.8 million, up 7% as reported year/year (y/y) and in constant currencies (cc)
- Services revenue in Q4 was €8.3 million, up 11% y/y and up 9% in cc
- ESI says it saw “buoyant sales momentum in Europe, notably France, and the solid sales growth recorded in the Americas.”
- For the year, license revenue was €83.3 million, up 3% y/y and up 4% in cc. The company says that repeat business fell a bit at cc, from 87% in 2013 to 86% in 2014. New licenses sold to new customers totaled €17 million, down €0.5 million from the year earlier, due to political and economic difficulties in BRIC countries, especially Russia and China.
- By geo, revenue from Europe was €54 million, up 8% on strong activity in France and Germany.
- Revenue from Asia was €39 million, down 2% due to currency and the difficult business climate in China.
- Revenue from the Americas was €19 million, down 9% due to the shift in focus of the Services business. ESI says that this “abandoning of certain non-strategic and lower margin services was not compensated by the increase in Licenses activity over the year.”
- Finally, revenue from BRIC countries was €14 million, down y/y. Gains in Brazil and India could not offset declines in China and Russia.
That’s what we know now. ESI will release more details on April 16, 2015.
I get asked this a lot: Which is better, NX or CATIA? ANSYS or Nastran? Which Nastran? Onshape or Fusion 360? PDMS or SmartPlant? As with the last Q&A post, I see that search engines deposit you at Schnitger Corp as you try to discover what others recommend.
I wish there were a simple answer, but there isn’t. Most of the commercial products on the market today are at parity, in general, but each has specialized competences that follow its maker’s strategic direction.
So, how do you decide what’s best for you? Figure out where you are today and where you want to be in a few years, then jump in:
- What features do you really need for work you’re doing today? Don’t be distracted by shiny new things that you know, deep down, aren’t necessary for you.
- But do look at directions –yours and theirs– to make sure there’s some sort of alignment. If you’re thinking of branching into industrial design, look for a CAD tool that can connect preliminary designs into the detailed design process so that you don’t have to recreate. If you want to offer laser scan services to your AEC clients, look for a platform that can incorporate point cloud data. If you’re currently offering FEA services and want to branch into CFD, look at the specific types of simulations you’ll be doing — don’t just go generic and hope it works out.
- Price matters but shouldn’t be the only deciding factor. Most vendors and resellers have payment options that can help take cost off the table. If your needs are sporadic, however, be sure to look at the products that offer monthly subscriptions.
- Consider staffing. This is huge: having decided on a CAD (or CAE or CAM, to a lesser extent) product that no one in your area knows how to use isn’t helpful. Talk to local community colleges, partners, resellers and other businesses in your town about the talent available to you.
- Partners matter. A lot. Make sure that your supplier can support you –with training, customizations, installs, hiring, engineering services or whatever you can think of– so that you’re backstopped when you need to be. A good partner does a lot more than sling software boxes.
- Test. Trial. Do bakeoffs. We don’t see bakeoffs too often these days, and it’s a shame. Years ago, interest groups set up evenings where experts would put software package A head to head against package B, creating the same model or performing the same tasks while attendees watched. Ease of use, intuitive modeling, help functions, speed — it was like watching a car race from inside the car. Which gets to the end fastest, and with the most ease? If that’s not available to you, ask for trial licenses. Even better, pay for a few to see what kind of support you get. Try to do your job with each solution, recognizing that your lack of expertise will be frustrating and make you slow — but you’ll learn a great deal about your options. If you’re in the market for CAE, also verify your results against physical tests and check against other, known results.
- Just do it. After a certain point, you simply need to decide. Putting it off won’t make the decision any easier and it’s unlikely that unknowns will become knowns. Software is an important part of what you do, but it’s an enabling tool and not the whole magilla. Odds are that you’ll be fine; you’ve considered your options, found a reliable partner — now go!
I know you want someone to tell you that product A is better than B, but it’s not that simple. There are more choices than ever before so the decision process is more about defining your needs and finding a fit that’s specific to you than it is about finding the one absolute best out there. Good luck, and chat back in the comments!
It’s ASNE time again, when the US naval engineering community gets together to figure out how to better spend the scarce resource that is taxpayer money in the defense of the nation. I wrote last year about the dynamics of the US shipbuilding industry—mostly military, a stressful balance of military, political and economic aims—and much of that remains unchanged. We’re getting closer on a lot of critical issues, like making procurement decisions based on a lifecycle view that incorporates construction and operations trade-offs; improving public/private cooperation and collaboration; and, fundamentally, keeping available and operational a fleet of assets that might have to last 50 years. These are long-term problems that won’t be easy to fix. What was most heartening at the American Society of Naval Engineers’ ASNE Day 2015 was progress we seem to be making towards more affordable, flexible solutions that address both military and industrial issues.
Keeping the fleet at operational availability
Did you know that the US Navy submits a report to Congress every year, detailing its view of what’s needed for the next 30 years? The plan lays out new construction objectives for the 300-odd ship fleet the Department of Defense believes it needs to meet defensive and offensive needs.
Thirty years ago was 1985: the Berlin Wall was still up, Madonna’s “Like a Virgin” topped the Billboard charts and NASA flew 9 Space Shuttle missions. Fast forward: no Berlin Wall but a storm of militant insurgencies, “Uptown Funk!” by Mark Ronson/Bruno Mars is #1 and the Apple wristwatch is the hot thing, while our space missions go nowhere. Even the best minds in the Navy would have been hard-pressed to reconcile the world of 1985 with today’s much harsher and more technological reality. (No comment on Madonna vs. Bruno Mars.)
And there’s the problem: the world changes so quickly that Navy assets are hard-pressed to keep up. Electronics, weapons systems, propulsion efficiency, stealth technologies, cyber-warfare defenses … All require upgrades that are hard to predict and schedule across a fleet whose ships are often 20-30 years old. If you’ve ever tried to retrofit cables behind a wall in your house, you understand the problem — now expand that to miles of cabling on an aircraft carrier, and you’ll start to see the scope of the challenge. (Wifi? Cyber-warfare issues.)
According to the 2014 (FY2015) 30-year report, the current battle force count is 289 (out of a required 306 by 2020). That 306 ship fleet breaks down to 60 submarines, 11 aircraft carriers, 88 large multi-mission surface combatants, 52 small multi-role surface combatants, 33 amphibious landing ships, 29 combat logistics force ships and 33 support vessels. To reach the 306 ship total, the Navy says it needs to add between 5 and 13 ships per year, every year from now until 2044 to maintain readiness. (We need to keep adding because ships are also being taken out of service: 14 ships in FY15, 7 in FY16, 6 in FY17 and so on — and, of course, all of this presumes no ships are lost outside of this plan.)
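The plan’s arithmetic checks out; a quick tally of the figures quoted above (ship counts from the FY2015 report, as cited):

```python
# Tallying the FY2015 30-year plan figures quoted above.
fleet_plan = {
    "submarines": 60,
    "aircraft carriers": 11,
    "large multi-mission surface combatants": 88,
    "small multi-role surface combatants": 52,
    "amphibious landing ships": 33,
    "combat logistics force ships": 29,
    "support vessels": 33,
}
total = sum(fleet_plan.values())  # the required 306-ship battle force
gap = total - 289                 # ships still needed vs. today's count
```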
According to the Congressional Research Service, the 2014 plan “projects that the fleet would experience a shortfall in amphibious ships from FY2015 through FY2017, a shortfall in small surface combatants from FY2015 through FY2027, and a shortfall in attack submarines from FY2025 through FY2034.” Why build and build and still fall short, you may ask? Because it takes a long time to spec out and design these ships, award their construction, source all of the equipment, etc. — and the Congressional budget process keeps stopping and starting these projects. Sequestration wreaks havoc on the Navy’s progress and gives very little impetus to commercial designers and shipyards to invest in technology, infrastructure and people in advance of receiving the specific order. (It creates other problems, too.)
The Navy says the 30 year plan would cost “an average of about $16.7 billion per year in constant FY2014 dollars to implement.” The CRS thinks it would be roughly 13% to 20% more expensive, depending on how you look at inflation.
That’s a lot of money. One way to stretch those dollars is to look at the fleet as a whole, rather than as a series of individual ship programs. I learned that the Navy operates something like 50 different types of ships and aircraft, each with a service life of 20 to 50 years. Every year, the Navy decides how to replace two-ish types/classes of ship or aircraft that need to be retired — the natural starting point is always to go for an exact replacement, but with the latest revs of everything to meet new challenges. What if, instead, we were to look at the fleet as a system and figure out how to keep it at peak operational efficiency? Across all assets, at all phases of their lifespans? That’s incredibly hard, but it’s where the smart thinking is going and it can’t happen fast enough.
Flexing to stretch further
Another way to stretch those dollars is to look at more flexible designs. In a perfect world, ships could be reconfigured as needed, for military missions against an identified enemy (think Cold War) or point conflict as we see today, but also for rescue and humanitarian missions when that need arises. Flexibility only goes so far, of course: you can’t turn an aircraft carrier into a submarine. And that’s the first, key decision in trying to make a more flexible fleet: what type of flexibility is really needed, and what are the bounds of that flexibility?
At ASNE Day 2015 we learned that the US is looking closely at the Danish style of frigate, called StanFlex. This is a modular mission payload system, where payloads may be weapons systems or something else. One Admiral said that StanFlex allowed the Danish Navy to put legacy weapons systems on the new frigate rather than having to develop and build new systems, as is typically done. But perhaps more importantly, one could swap out a non-functional system for a working one in a couple of days, without having to take the ship out of service or wait for a significant overhaul, saving time in port and returning the ship to availability more quickly.
This flexibility could extend into many areas of the vessel. Defining zones (“margins” in Navy-speak) would ensure access to critical components, making it easier to swap heating and cooling systems, for example, or to isolate systems to make software upgrades possible. Small, incremental changes rather than big, extensive overhauls. Days rather than months or years.
This is a cultural shift in the way the Navy thinks and plans — but, from a PLM perspective, it’s something we can already relate to. It comes down to managing data: which system, which rev, which components on each specific hull number? What does it need, when does it need it? How to schedule that change (and perhaps one or two others) in a port that can handle it? What to do now, and what to defer until a scheduled maintenance session? If we think of ships as systems and subsystems and map bills of material or process, we can do this now, technologically. But it’s very hard to do, operationally (and we don’t seem to have the starting data for a lot of the older ships), and that’s where things get sticky.
This requires a fundamental change in the way ships are designed. Naval engineers need to build this flexibility into the design from the very start of a program. One attendee showed me how the Littoral Combat Ship has zones into which specific types of modules can be inserted (weapons in these two, but not these four — think Lego block assemblies, rectangular vs square). Not everything is interchangeable but everything is interrelated — if you use too much power for Module 1, you won’t be able to operate Module 3 without adding a generator in Module 6. That becomes an operations geek’s dream: optimizing each hull for a specific mission, given all of the parameters of available modules, weight, power, etc. Complicated but also completely doable if you have as-built information for the ships and the details for each module.
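That power-budget interplay can be expressed as a simple feasibility check. This sketch is hypothetical: the module names, wattages and budget below are invented to illustrate the constraint, not drawn from any actual ship design:

```python
# Hypothetical power-budget check for a modular configuration, in the
# spirit of the Lego-block constraints described above. All figures
# are invented for illustration.

def feasible(modules_kw, budget_kw, generators_kw=0):
    """True if total module draw fits within the base budget plus generators."""
    return sum(modules_kw.values()) <= budget_kw + generators_kw

config = {"module_1_weapons": 400, "module_3_sensors": 350}   # 750 kW draw
over_budget = feasible(config, budget_kw=700)                 # draw exceeds budget
with_generator = feasible(config, budget_kw=700, generators_kw=200)
```

Scale this across every zone, hull and mission parameter, add weight and space constraints, and you have the operations geek’s optimization problem.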
Flexible ships were also the theme of this year’s Global Executive Shipbuilding Summit, a meeting sponsored by Siemens PLM that runs alongside ASNE Day, and will continue to be a focus for the GSES VI working group in 2015. Uniformed military, civilian procurement officials, industry and a couple of ringers like me brainstormed approaches to the concept, from organizational change and strong leadership to approaches to costing out the alternatives. I’d say there were a hundred people in the room; 30 of them want to keep working on the problem and report back at next year’s ASNE Day and GSES — that’s how important this is.
One retired Navy guy said it best: Our ships are compromises. Between the Navy and Congress. Between the operations and maintenance teams and the engineers. Between the people who design systems that are too complicated and the sailors who have to operate them. We often take too long to get all of these stakeholders on board because the ships are so expensive no one wants to make a mistake. But our biggest problem is making all of these decisions in isolation, without looking at the whole: the industrial base is losing qualified people because the work isn’t steady enough. Talented procurement officers are leaving public service because sequestration makes them feel undervalued. We sometimes seem to have too many of one type of ship and not enough of another. We need to consider our force as a whole, look at our industrial capability in its entirety and get them all working together.
That’s a paraphrase; I couldn’t write fast enough to get it all down. His frustration, though, was very clear. A flexible ship, well-designed, affordable, and manned by competent crews, is the answer to a lot of what’s ailing the US Navy. It’s good for commercial enterprise too, opening it up to smaller, more agile competitors at the system/module level. More on that in another post.
Siemens PLM graciously covered some of the expenses associated with my participation at the event but did not in any way influence the content of this post.
Image is of the Ohio-class ballistic missile submarine U.S.S. Rhode Island. Photo by Mass Communication Specialist 1st Class James Kimber, U.S. Navy via the Congressional Budget Office report on the Navy’s Fiscal Year 2015 Shipbuilding Plan.
AEC acquisitions have been all the rage so far this year — and Trimble keeps the ball rolling with today’s announcement that it’s bought Fifth Element (no, that was a movie; this is forestry), based in Finland. Financial terms were not disclosed.
Fifth Element’s LogForce, available now, is a planning and logistics solution while WoodForce, out later this year, will connect harvesters with inventory locations to help make sure that haulers are traveling empty as rarely as possible.
Trimble’s Connected Forest, like the Connected Farm, aims to use technology to manage the full lifecycle of the resource, applying industrial logistics to planning, planting, growing and harvesting. We think of forests as tranquil places, and they are, but most are also working farms, carefully managed to produce a specific harvest. Trimble’s enterprise forest management solutions turn this art and science into business processes that analyze options, determine priorities, optimize resource allocation, and track progress against a number of targets.
If you’ve ever been in Northern Maine (or Vermont or New Hampshire or Washington state …) you’ve seen logging operations and perhaps pulled off the road to let a logging truck scream past at impossible speeds. Loggers are on a hillside, cutting trees and piling them up, waiting for transport to a pulp and paper mill or harbor. That brute force process is increasingly automated, with location-enabled equipment that can target specific trees (or stands of trees) for cutting at a precise time. Forestry can be a risky business; engineers must balance the sustainability of timber and watersheds with the need for harvesting and recreational use. In many parts of the US (certainly in Northern New England), it’s a hugely contested topic and foresters must be able to submit economic models, environmental impact statements and other materials to build their case for continued access to these lands.
Ken Moen, general manager of Trimble’s Forestry Division, said in the press release announcing the acquisition that “adding Fifth Element solutions to Trimble’s forestry portfolio, we can better address operational forestry challenges around the world … Our fundamental focus is to provide solutions that drive integration of business data, improve efficiency and provide better visibility into forest operations to maximize productivity and profitability.”
I wasn’t aware of Fifth Element before today, but am told that the company sprang up out of the need for forestry companies to systematize working practices, automate and then put into the cloud for more ready accessibility in remote forestry sites. It’s another proof point of how mobile technology and location-based information can transform even very traditional businesses.
Onshape, the latest CAD company founded by Jon Hirschtick, John McEleney and other names you likely recognize, took the wraps off this morning to finally let us talk about the worst-kept secret in the CAD world. To skip right to the part you probably care about: What is it? It’s CAD-in-the-cloud with no local install, free unless you want your designs to be private, has nifty data management and collaboration capabilities built in, and is so easy to use that even I can do it. Now in public Beta, you can request access here.
Expectations are incredibly high. What else could they be, when the likes of Hirschtick, McEleney, my CV colleagues Scott Harris and Dave Corcoran, and other CAD veterans start a CAD company? And when they then team up with recognized experts in cloud, mobile and security, and bring in Harvard’s CFO, people take notice.
According to Fortune, “Onshape’s goal is to modernize CAD software.” “The vision of the company is that everyone on the design team is able to use CAD together, on any computing device, anywhere,” Hirschtick says. That vision enabled Onshape to raise $64 million and values the company at $295 million, including the funding. Crazy for a company with one product that’s still in Beta, no? Expectations …
The vision does seem to be on its way to realization. Joe Dunne (another ex-CV, ex-SolidWorks guy) gave me a demo last month and made it possible for me to try Onshape myself. It’s typical CAD –insert shapes, modify them, fillet/boss/etc.– with a couple of added twists:
- Onshape runs in your browser with no local install. The user interface is simple, clean and uncluttered
- You create parts and assemblies, and use them to create drawings (though the drafting elements are not fully implemented yet)
- Your parts are stored at Onshape where a PDM-like container structure creates version and access control. No checking in/out, no locked files, no local copies; your entire team is working on the part at the same time and Onshape’s secret sauce lets everyone see what’s going on in real time
- Because it runs in a browser, you can use Onshape on Macs, PCs and mobile devices (am told Linux, too, but haven’t tried that). Playing with a design on my Mac and iPhone 6 at the same time was easy and fast (though a bit weird — will we really try to do CAD on that tiny screen?)
- The collaboration is natural. I can pull on a feature while you fillet the surface next to it, if that’s how we want to work. Whoever “sets” the design first has their changes communicated to the rest of the team. I haven’t run across that work process, but perhaps that’s because the tools haven’t encouraged it –I’m used to working on one aspect of a project while others work on theirs– but am interested to see how people who aren’t constrained this way will use it
- Since designs rarely start from scratch or operate in a vacuum, you can import and export common CAD formats. Onshape uses Parasolid, so expect the smoothest transactions with other Parasolid-based modelers
- How much does it cost? It depends. There’s a free version, but you can only have 5 private designs; the rest are visible to others. If you want your designs to be private, that’s $100/month (though there are apparently enterprise deals as well)
- The viral aspect to Onshape is interesting, too. Working on a design? You can notify collaborators by asking Onshape to send them an email that invites them to join Onshape. Presumably, once they try it, they’ll like it and want to sign up for one of the premium accounts
But whether you’re using the free or premium version, this is real CAD. And it’s the same version regardless of payment model: no dumbed down or less capable versions. For a beta, it feels remarkably stable. The business model is interesting, too: free or $100/month is going to get a lot of people to try Onshape, and the invitation mechanism will get them to sell it to their friends and collaborators. It’s not trying to replace corporate CAD but coexist with it, at least for now. It appears to be a “try it, you’ll like it” model, with the potential to be insidious: an IT manager might find dozens of Onshape users in a SolidWorks/Creo/Solid Edge environment, forcing a keep/let go decision. Someday.
A Boston Globe article yesterday pointed out something I hadn’t realized: the average age of Onshape’s founders is over 50. Can so seasoned a team create start-up magic again? Can they step back from what they’ve done in the past (impressive as it is) and come up with a new and novel solution to a problem that really hasn’t changed since SolidWorks debuted 20 years ago? It seems to me that they really did start from scratch, looking at what designers need today, how they like to work and what devices they want to use. Yes, it has to have the essential CAD functionality we all expect, but in addition:
- Eliminating software download/install removes the possibility of version conflicts (and those pesky IT people getting in the way).
- Storing the parts in the cloud gives everyone access; controlling that access becomes key and is handled by a relatively simple Google-docs like permission model. Backed by, I am told, world-class security protocols.
- Building in collaboration creates a different way of working, more naturally, with remote partners and customers.
It’s an impressive start.
Image courtesy of Onshape.
Trimble today announced that it has acquired Linear project GmbH, a privately-held provider of scheduling software for linear or corridor infrastructure projects –that’s highways, railways, pipelines, tunnels, transmission lines and similar built assets that need to include location for a complete definition. These projects need to assess and work with site geometry, schedules and resources to plan and execute cost-effectively; staging material too far from where it is needed, for example, wastes time and money.
Linear project’s TILOS is a time and location planning tool that merges place and schedule into a single graphical view that reflects the current state of each, then updates dynamically as conditions change. It’s an extension of the familiar Gantt chart that can be hard for non-experts to interpret at first, but it enables contractors, owners and civil engineers to plan and manage linear projects more effectively.
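The underlying idea is easy to sketch: each activity occupies a stretch of the corridor over a window of time, and two activities clash only if they overlap in both dimensions. The data structures and numbers below are invented for illustration and say nothing about TILOS’s actual internals:

```python
# Minimal time-location model: an activity is (start_day, end_day, start_km, end_km).
# Two activities clash only if they overlap in time AND in corridor position.

def clashes(a, b):
    t_overlap = a[0] < b[1] and b[0] < a[1]    # time windows intersect
    km_overlap = a[2] < b[3] and b[2] < a[3]   # corridor stretches intersect
    return t_overlap and km_overlap

paving = (0, 10, 0.0, 5.0)      # days 0-10, km 0-5
earthworks = (5, 15, 3.0, 8.0)  # days 5-15, km 3-8
conflict = clashes(paving, earthworks)  # both crews on km 3-5 during days 5-10
```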
Alan Sharp, business area director for Trimble Heavy Civil Construction software solutions, said in a press release that “[w]ith the addition of TILOS, Trimble can better address the needs of project owners, engineers and civil contractors by enabling them to closely manage operations and execute their projects on schedule and within budget. The Linear project team brings a wealth of experience and an expansive customer base that includes many of the world’s largest contractors and rail network operators. We are very excited to have them join Trimble.”
Details of the transaction were not released.
Well, well, well. Another independent gobbled up. FARO, maker of laser scanners for metrology (very tight tolerances) and infrastructure applications (huge data sets, accurate, often outdoors), just acquired kubit, a German company that has been a leader in developing hardware-agnostic software to process point clouds. You may have heard of kubit’s PointSense Plant, which (semi)automatically processes raw point cloud data into pipe runs and other plant elements using pattern recognition. “Walk the run” means the software takes the user along a length of pipe; identifies pipes, tees, reducers and in-line fittings; and then suggests to the user which to use, based on catalogues identified for the plant/project. I’m not sure if kubit invented “walk the run”, but it’s a common term and technique as point cloud processing gets ever more automated.
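The catalogue-suggestion step lends itself to a simple sketch. The sizes, tolerance and function below are invented for illustration; kubit’s actual pattern recognition is far more sophisticated:

```python
# Sketch of the catalogue-suggestion step in a "walk the run" workflow:
# given a pipe diameter estimated from the point cloud, propose the
# closest catalogue size, or nothing if no size is close enough.

def suggest_fitting(measured_mm, catalogue_mm, tol_mm=5.0):
    best = min(catalogue_mm, key=lambda d: abs(d - measured_mm))
    return best if abs(best - measured_mm) <= tol_mm else None

catalogue = [114.3, 168.3, 219.1, 273.0]      # common pipe ODs in mm
match = suggest_fitting(170.1, catalogue)     # close to a standard size
no_match = suggest_fitting(400.0, catalogue)  # nothing within tolerance
```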
The acquisition makes sense. FARO has been buying software companies that can add value (and workflows for people not used to using software) to point cloud data — most recently with several accident scene reconstruction tools. Of the FARO deal, CEO Jay Freeland said, “[t]he acquisition of kubit is an exciting step in FARO’s strategy to develop integrated, disruptive 3D documentation product offerings for the Architecture, Engineering and Construction market. By adding kubit’s products to our portfolio, customers now have significantly enhanced software options to serve a vast array of point cloud modeling, analysis needs, and measurement capabilities with very high connectivity to the Autodesk suite of products.”
The acquisition includes substantially all of the assets of kubit’s U.S. distributor kubit USA, Inc. The price was not disclosed, though FARO did say that it was an all cash transaction that includes an initial payment as well as future earnout payments.
What do custom-made dresses for Dita Von Teese, military airfield runways, Ebola containment/patient care and museum displays have in common? Thinking … thinking … At first glance, not much. But all were prominently featured at last week’s REAL 2015, a gathering to cross-pollinate the arts, design and engineering, historical preservation and industry around the themes of capturing, computing and creating reality. When Autodesk first floated the idea of hosting REAL, it sounded like it couldn’t possibly work —what, after all, could an aircraft maker learn from someone who designs urban installations made from hula hoops?— but it did. Stepping away from the daily grind to look up and around at innovative projects led to an incredible creative energy that permeated the venue. It’s not clear when Autodesk will host the next REAL event, but try to go. The return on your time investment will be manyfold.
Capturing is probably the most well-known of the trio. You’ve seen surveyors on a local street or a Google Streetview vehicle, and maybe your dentist has scanned your mouth for a crown. You may even have snapped away with your cellphone camera, so you know about capturing the reality that surrounds you. What you probably don’t know is how rapidly that technology is evolving and growing to encompass even more types of sensors. REAL speakers showed off laser scanners big and small, photogrammetry, heat-seeking, chemical, CAT/PET medical imaging and other types of sensors that enabled them to explore and “map” their projects.
One of the most creative and thought-provoking presenters was Prof. Sarah Kenderdine of the University of New South Wales, who is dedicated to making the world’s historical artifacts available to more people than can make it to a museum — and to making more of the museum available to the public. Did you know that many museums can only show 2% of their collections? That means 98% of their artifacts are hidden away, not because they’re unimportant, but simply because there’s no space, or the space is inappropriate to safeguard the treasures. Prof. Kenderdine has developed installations that let the public explore distant caves or huge collections from within a space (often portable), using scanned images combined with soundscapes and interactive media to make history come alive. That’s great, you say, but what’s the commercial application? How can my widget-making/cars/electronics/buildings/bridges benefit from something like this? By giving the consumer of your creation a glimpse into the manufacturing process and into how the product will look and feel before they actually have the object. Sit in a driver’s seat, test-fly a plane, feel (as much as is digitally possible) how the object will sit in your hand, explore a new home before it’s built — foster a better, tighter connection with your customers before committing to the actual manufacturing, before real money is spent to bring the product to physical life.
Drones were a huge part of the capture phenomenon at REAL 2015, especially as the FAA just released new operational guidelines. Nothing is clear-cut, so check with an aviation attorney. Speakers had drones doing everything from automatically checking the state of a construction project at the end of the work day as prep for the next morning’s status meeting —it’s next to impossible to argue about whether something is complete when there’s visual confirmation and a model to walk through— to periodic flyovers to capture erosion data for environmental modeling. AEC industry attendees were looking for new business areas, and many found them in new applications for drone-based sensing. As sensors continue to get smaller, lighter and less power-hungry (and as regs around their use become clearer), many of the measuring tasks carried out today by humans will shift over to drones.
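The erosion use case boils down to differencing. As a simple illustration (my own sketch, not any speaker’s actual workflow), subtracting two elevation grids captured on successive flyovers gives the depth of loss per cell, and summing over cell area gives the volume of material removed:

```python
import numpy as np

# Two elevation grids (metres) from flyovers a month apart; each cell is 1 m x 1 m.
before = np.array([[10.0, 10.0, 10.0],
                   [10.0,  9.5, 10.0],
                   [10.0, 10.0, 10.0]])
after  = np.array([[10.0, 10.0, 10.0],
                   [10.0,  9.0,  9.8],
                   [10.0, 10.0, 10.0]])

diff = after - before                  # negative cells lost material
eroded = np.where(diff < 0, -diff, 0)  # depth of loss per cell (m)
cell_area = 1.0                        # m^2 per grid cell
volume_lost = eroded.sum() * cell_area
print(round(volume_lost, 2))           # 0.7 m^3 of material removed
```

The hard parts in practice are upstream of this arithmetic: registering the two surveys to the same coordinate frame and filtering vegetation and noise so the differences are real.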
Creating, because of the hype around 3D printing/rapid prototyping/additive manufacturing, is also very much in the news these days. Ms. von Teese’s dress, the Ebola isolation cubes, art installations — all examples of making with new technology. But not all creating has to be physical, or based on emerging 3D technologies. One of the best sessions was on the first day, when Autodesk CEO Carl Bass (above) and artists Bill Kreysler and Bruce Beasley spoke about their creative processes, about movement and line, and how today’s technologies let them model digitally and in prototype form before committing materials and time to the full, final product. I’m no artist; I tend to color inside the lines and am freaked out by a blank piece of art paper. But I do cook, which Mr. Bass said is included in a broader definition of “maker”. Mr. Bass said that 30 million people go to Autodesk’s Instructables website each month, adding and referencing ibles (what the cool kids call the projects listed there) of all sorts, from cooking to crafting to electronics and 3D printing. The point: “We need to be a little less proud about letting computers help us design”, he said, since the future of making is a “weird mix of analog and digital.”
The compute part, the middle bit of capture/create/compute, is least understood but likely the most powerful driver of change in how we make. We can take laser scans and photos by the billions, but unless we can stitch them together and use them for some downstream purpose, it’s cool but kind of pointless. We run the risk of capturing gigabytes of the wrong data without an end-game in mind. REAL demonstrated both the current state of reality computing, leading to very realistic 3D-feeling scans, for example, and raised real questions about how far we can take this technology. Lance Filler, Airfield Damage Repair Program Manager for the US Air Force, is charged with getting runways back into operation at military installations after an attack. Before planes can safely land or take off, the runways need to be repaired — but before that, a team must clear unexploded ordnance and then assess the damage to the runway. And before even that, Mr. Filler’s team has just 30 minutes to count and map thousands of bombs, identify craters and nearly invisible penetrations into the subsurface under the pavement, and provide assessments for the removal and repair teams. This is nowhere near my normal wheelhouse (or most of the audience’s, I’d wager), but you could see the wheels turning: laser, sonar, heat-seeking — what types of sensors? Drones, vehicle-based, line-of-sight from a platform? How to turn the raw data into something that knows this is a normal crack in the pavement but that is an unexploded bomb? How to map the data — GIS? Oh, and the system can’t itself become a target, so it must be mobile and quick to install. Gotta be cheap, too. No easy answers, but once we get to the heart of these types of computing challenges, the line between the digital and physical worlds really will blur.
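To make the “crack or crater?” question concrete, here is a deliberately naive sketch (mine, not the Air Force’s method) of one sub-problem: given a gridded depth map of the pavement, cluster contiguous deep cells and keep only clusters large enough to be craters rather than ordinary cracks.

```python
import numpy as np
from collections import deque

def find_craters(depth: np.ndarray, min_depth: float = 0.3, min_cells: int = 3):
    """Group contiguous cells deeper than min_depth (4-connectivity);
    keep clusters big enough to be craters rather than thin cracks.
    Returns a list of cell-coordinate lists, one per crater."""
    mask = depth > min_depth
    seen = np.zeros_like(mask, dtype=bool)
    craters = []
    for r in range(mask.shape[0]):
        for c in range(mask.shape[1]):
            if mask[r, c] and not seen[r, c]:
                blob, q = [], deque([(r, c)])   # breadth-first flood fill
                seen[r, c] = True
                while q:
                    y, x = q.popleft()
                    blob.append((y, x))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < mask.shape[0] and 0 <= nx < mask.shape[1]
                                and mask[ny, nx] and not seen[ny, nx]):
                            seen[ny, nx] = True
                            q.append((ny, nx))
                if len(blob) >= min_cells:
                    craters.append(blob)
    return craters

depth = np.zeros((5, 5))
depth[0, 0:2] = 0.4    # a crack: two deep cells in a thin line
depth[2:4, 2:4] = 0.6  # a crater: a 2x2 block of deep cells
print(len(find_craters(depth)))  # 1 crater; the crack is filtered out
```

The real problem is vastly harder — fusing multiple sensor types, working under a 30-minute clock, flagging subsurface penetrations a depth map can’t see — but even this toy shows that the answer starts with turning raw measurements into labeled, mappable features.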
The projects showcased at REAL covered the range from the expected, such as using laser scans to check for panel alignment in concert halls, to the unexpected, like scanning coral reefs to do differencing over time. My key takeaways:
- No matter how cool the toy, you have to let the concept lead. Technology must be in service of the project, and not implemented for its own sake. This is true in reality capture, in the Internet of Things, in gathering data for predictive maintenance …. You can quickly become overwhelmed by all that you can do; stick to what creates real benefit. But
- If you’re too set in your ways, closed to new ideas, you need to be ready to lose to agile innovators. Business benefit is in the eyes of your customer, and one of these insanely creative entrepreneurs can find a maker space, borrow a garage, or leverage new technologies in a way that simply out-innovates traditional companies. Lumio was apparently born because an inventor was told that lighting wasn’t meant to be beautiful or portable in this way. No factory? No problem. But
- No one innovates alone. Reality computing seems to work best when engineers look at nature and art; when artists learn material science; when museum curators become virtual reality specialists; and so on. Because it’s a visual medium, it makes collaboration quick and easy and fosters conversations across disciplines and across industries that might never have happened otherwise.
Autodesk didn’t focus on its products at all during the event –no sales pitches except from the co-sponsors– but we did learn that Memento is now in public beta and available for download by Windows users. Memento leverages cloud computing to create high-quality meshes from photo or laser scan captures, readying the data for use in additive or subtractive manufacturing, virtual reality and other workflows. The Mac version will be out in a couple of months. “Memento” literally means “keepsake” or “souvenir”, but that’s not the only use for the product: designers can use it to create a starting point for further refinement, since the mesh can be exported to many of the 150 or so products Autodesk markets, then updated in CAD or simulated with FEA and CFD tools.
Whether you think this reality computing thing has legs or not, it’s undeniable that low-cost tools and compute horsepower, plus some DIY investment of time and money, aided and abetted by the huge amount of expertise available via the Internet, let many more people participate in connected exploration. We can work together to map coral reefs, photograph important cultural relics before they’re destroyed or lost, and design public spaces that meet the needs of the community. That kind of cross-pollination is bound to lead to cool new ideas and business opportunities.
Note: Autodesk graciously covered some of the expenses associated with my participation at the event but did not in any way influence the content of this post.