Sunday, August 14, 2016

The future minefield of DirectShape Elements.

When Autodesk released the DirectShape API a few years back, I thought it was fantastic. Finally, a way to create arbitrary geometry that could live on within Revit - rather than being limited to the elements that the API let us create, in the ways that the API let us create them.

We used this to interesting effect in our own Scan to BIM product - providing a mechanism to capture irregular shapes as Revit geometry. What's also interesting: beyond the creation of geometry, the API allows you to assign these objects just about any category you want. Think about that for a moment - you can make any arbitrary piece of geometry and declare that it is a wall, a room, stairs, etc.
This felt incredibly cool, but also incredibly dangerous to me from a BIM perspective. While assigning the wall category confers some characteristics of a wall, the result is not an actual wall element, and there are a lot of limitations there.

This Week
This week I experienced, for the first time, the downside of this approach. A customer was using one of our other tools on one of their models, and it was throwing a weird exception in our app. We were looking through the logs, and based on what we were seeing, it seemed like a model corruption issue. We had a method which retrieved all of the phases that had rooms assigned to them - and one of the rooms apparently had a phase which was not actually a phase in this model. How could this happen? Maybe corruption? Maybe some kind of issue with cut-and-paste between models?

We started working on a workaround from that angle - but a day later, the customer was able to share the model with us. When we actually ran it under the debugger, we saw which phase the room was associated with: -1 (nothing). That hadn't seemed possible in our experience. Then we looked closer - the element in question was not actually a Room; it was a DirectShape element. How the heck did that get in there?!?
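
For context, the lookup was something along these lines (a simplified sketch, not our exact code - ROOM_PHASE is the built-in parameter being read):

FilteredElementCollector coll = new FilteredElementCollector( doc );
foreach( Element e in coll.OfCategory( BuiltInCategory.OST_Rooms ).ToElements() )
{
   Parameter p = e.get_Parameter( BuiltInParameter.ROOM_PHASE );
   // For a real Room this is the phase's ElementId; for the DirectShape
   // posing as a room, the value came back as -1 (ElementId.InvalidElementId).
   ElementId phaseId = ( p != null ) ? p.AsElementId() : ElementId.InvalidElementId;
}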

The real problem was in code like the following - the kind of thing we had written a thousand times over the past six or seven years (ever since the FilteredElementCollector was introduced):

FilteredElementCollector coll = new FilteredElementCollector( myDoc );
IList<Element> elements = coll.OfCategory( BuiltInCategory.OST_Rooms ).ToElements();

Historically, if all you wanted was the elements that were rooms, this was all that you needed.
But in this case, some API developer had created an add-in that made a DirectShape box and assigned it to the Rooms category.
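
To give you a sense of how little it takes, here is a minimal sketch of that kind of add-in code (my own names and dimensions; it assumes an open transaction and uses the two-argument DirectShape.CreateElement overload from the newer API releases):

// Build a 10' x 10' square profile and extrude it into a box.
XYZ p0 = XYZ.Zero;
XYZ p1 = new XYZ( 10, 0, 0 );
XYZ p2 = new XYZ( 10, 10, 0 );
XYZ p3 = new XYZ( 0, 10, 0 );
CurveLoop profile = CurveLoop.Create( new List<Curve> {
   Line.CreateBound( p0, p1 ), Line.CreateBound( p1, p2 ),
   Line.CreateBound( p2, p3 ), Line.CreateBound( p3, p0 ) } );
Solid box = GeometryCreationUtilities.CreateExtrusionGeometry(
   new List<CurveLoop> { profile }, XYZ.BasisZ, 10.0 );

// The loophole: declare the box to be a "room" (or any category you like).
DirectShape ds = DirectShape.CreateElement(
   doc, new ElementId( BuiltInCategory.OST_Rooms ) );
ds.SetShape( new List<GeometryObject> { box } );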

So - going forward, we as API developers can no longer rely on the category (or even the category combined with WhereElementIsNotElementType) as a reliable indicator of the .NET type in the Revit API.

Our quick fix for this particular issue was:

IList<Element> elements = 
   coll.OfCategory( BuiltInCategory.OST_Rooms ).OfClass( typeof(SpatialElement) ).ToList();

This would ignore any DirectShape elements that were showing up as rooms. We quickly found other places (in other methods) where we had made the same bad assumption, and had casting errors like:

IList<Room> rooms = 
  coll.OfCategory( BuiltInCategory.OST_Rooms ).Cast<Room>().ToList();

The DirectShape elements failed the Cast<Room>() at runtime with an InvalidCastException.
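
The repair followed the same pattern as above - narrow the collector to real SpatialElements before casting:

IList<Room> rooms = 
  coll.OfCategory( BuiltInCategory.OST_Rooms ).OfClass( typeof(SpatialElement) ).Cast<Room>().ToList();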

Even when fixed, these issues are bound to get confusing. I believe in many cases the DirectShape elements may still schedule as their assigned category, so our customers will believe they have N rooms in the model, when in fact only some of them are "real" rooms.

I'm still wrapping my head around it, but I think ultimately you'll just always have to have an "OfClass(typeof(MyDesiredClass))" on the collector. If you don't, you're liable to get things you don't expect.
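
One way to bake that habit in is a small helper like the following (just a sketch - the CollectByCategory name is mine). Note the use of LINQ's OfType<T>() rather than Revit's OfClass() filter: the class filter rejects Room outright (the exception tells you to filter on SpatialElement and post-process, which is presumably why our quick fix above looks the way it does), while OfType<T>() just silently skips anything that isn't really a T:

public static IList<T> CollectByCategory<T>( Document doc, BuiltInCategory bic )
   where T : Element
{
   // OfType<T>() drops DirectShapes (and anything else that merely
   // wears the category) instead of throwing like Cast<T>() would.
   return new FilteredElementCollector( doc )
      .OfCategory( bic )
      .OfType<T>()
      .ToList();
}

// Usage:
// IList<Room> rooms = CollectByCategory<Room>( doc, BuiltInCategory.OST_Rooms );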

All in all, it's not too bad to address, if you know about it. The key is that it's a loophole that other developers can open - and all of us who make software that runs in models where you don't know what other add-ins have touched them will have to be double-careful about our assumptions. And I'm not looking forward to digging back through all the code I've ever written (and am still supporting) to find all of the bad assumptions I made.



Thursday, May 19, 2016

Autodesk Vault API 2017: 32-bit/64-bit challenges....

For end-users of Autodesk Vault - it's probably a welcome development that Autodesk Vault 2017 now supports a 64-bit version (and a 32-bit version if you're still running on a 32-bit OS).

Less so, however, for Vault developers, whose lives may get more complicated. The explanations in the documentation and on social media have been somewhat lacking (Where are you, Doug Redmond? The citizens of Vault API land need you!).

Here's what I've managed to piece together so far:

The SDK
You'll see that the SDK ships with two subfolders of the "bin" folder - "x86" and "x64". The bulk of the DLLs inside of both folders are technically "ANYCPU (64-bit preferred)" - with the exception of the new Clic License Manager loader DLL, which seems to be platform-specific. (The Clic thing is something else that could use a bit more explaining than we've been given).
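
If you want to check for yourself how any of these DLLs were compiled, a reflection-only load will tell you without executing anything (a sketch - the method name and path are placeholders):

using System;
using System.Reflection;

// Reports e.g. "ILOnly" for ANYCPU, or "ILOnly, Required32Bit" for x86-only.
public static void ReportPEKind( string dllPath )
{
   Assembly asm = Assembly.ReflectionOnlyLoadFrom( dllPath );
   PortableExecutableKinds peKind;
   ImageFileMachine machine;
   asm.ManifestModule.GetPEKind( out peKind, out machine );
   Console.WriteLine( dllPath + ": " + peKind + " / " + machine );
}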

The Vault Explorer Environment
If you're doing Extensions within Vault Explorer, it's pretty straightforward: If you install on a 64-bit machine, you get the 64-bit versions. The documentation says that you'll have to make sure your DLLs match the bit-ness of Vault Explorer... But I believe that it IS possible to use ANYCPU for your explorer customization DLLs (they should match the Vault Explorer automatically, and all of the referenced DLLs are ANYCPU).  So that's not really bad at all (unless you've got 32-bit-specific dependencies in your project - and then you're in for a longer day, even after you've got things worked out... Separate DLLs, separate installers, etc).

Vault Job Processor
This seems to be the same situation as with the Explorer extensions above.

Standalone Vault Applications
This is the one that really had me puzzled for a little while. I was naively thinking that I could leave my standalone Vault apps as 32-bit (even on a 64-bit machine). That might work in some cases, but if you've used any of the higher-level pieces of the framework (the VDF, etc.), then you're out of luck. Those pieces of the framework will attempt to load any Vault extensions installed on the current machine - and they seem to cause problems if ANY of those extensions don't match the bit-ness of your standalone app.

So - if you're doing standalone apps for sale, or other cases where you can't guarantee the bit-ness of the operating system, then you may need to do two versions of the apps... One 32-bit and one 64-bit.
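
Since then, I've taken to logging the basics at startup, so that a mismatch is at least obvious in the support logs - a trivial sketch:

// If the process is 32-bit on a 64-bit OS, any 64-bit Vault extensions
// on the machine may trip up the VDF when it scans for them.
Console.WriteLine( "64-bit OS:      " + Environment.Is64BitOperatingSystem );
Console.WriteLine( "64-bit process: " + Environment.Is64BitProcess );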


All in all, this stuff just kind of snuck up on me - I don't recall hearing about it back at Developer Days - so it's been a more complicated upgrade process than I anticipated.

Good luck out there...