Sloth, the online BCF viewer

I am a vocal proponent of the BIM Collaboration Format. I think issue tracking is what is missing in the building industry, and BCF provides a standard to do just that. Meanwhile, more and more tools are supporting BCF, with the notable exception of Autodesk products, even if add-ons can fill this gap for Revit and Navisworks.

Yet, convincing people to leave their spreadsheet-based ways of working for BCF-based workflows still requires some form of paper-based report extracted from these BCF issues. Most BCF-based tools lack this sort of reporting. In this regard, BIMcollab is the best one, providing the ability to create PDF and Excel reports. But even BIMcollab doesn’t support Word documents, or the ability to use a custom template.

A while ago, I built a small utility to create Word reports from BCF files. However, this tool doesn’t support BCF 2.0, and lacks the ability to create Excel reports. It was high time to upgrade it. Instead of improving upon it, I decided to rebuild it, this time as a web-based tool, to make it more accessible.

So let me introduce my very first “web app”: Sloth, an online BCF report generator.

Sloth BCF

You can upload a BCF file and see the resulting report in your browser, showing the main information of each issue: title, author and creation date, along with the first screenshot of the issue.

Uploading a BCF file

Once your issues are loaded in your browser, you can export them as a report in Word or Excel format, containing all the information found in the BCF file. Sloth uses basic Microsoft Word styles to highlight titles and dates.
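
Under the hood, producing such a report mostly means writing paragraphs tagged with built-in Word style names. As an illustration only (not necessarily how Sloth is implemented), here is a minimal sketch using the Open XML SDK that writes an issue title with the “Heading 1” style:

using DocumentFormat.OpenXml;
using DocumentFormat.OpenXml.Packaging;
using DocumentFormat.OpenXml.Wordprocessing;

class WordReportSketch
{
	static void Main()
	{
		using (WordprocessingDocument report = WordprocessingDocument.Create("report.docx", WordprocessingDocumentType.Document))
		{
			MainDocumentPart mainPart = report.AddMainDocumentPart();
			mainPart.Document = new Document(new Body());

			//An issue title, tagged with the built-in "Heading 1" style
			//(the style definition itself comes from the template or the default styles part)
			Paragraph title = new Paragraph(
				new ParagraphProperties(new ParagraphStyleId { Val = "Heading1" }),
				new Run(new Text("Issue title")));

			//The creation date, as a plain paragraph (placeholder value)
			Paragraph date = new Paragraph(new Run(new Text("Created on 2017-01-01")));

			mainPart.Document.Body.Append(title, date);
		}
	}
}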

Export a Word report

You can also upload a Microsoft Word template (.dotx) before exporting the Word report. Your template will be used to create this report.

Add a Word template

To read these BCF files, I am using the XbimBCF .NET library developed by the openbim.org team. XbimBCF follows the BCF specification to the letter, adding every necessary check to ensure that all mandatory values are present and within their allowed ranges. A few solutions don’t follow these specifications, and you will end up with the following error message if you try to upload a BCF file created by one of them:
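
For reference, a BCF file is simply a ZIP archive containing one folder per topic, each with a markup.bcf XML file describing the issue. The following is a minimal sketch of listing topic titles with plain .NET; it is not the XbimBCF API, and it skips all the validation that XbimBCF performs:

using System;
using System.IO.Compression;
using System.Xml.Linq;

class BcfSketch
{
	static void Main()
	{
		//Path to the BCF file (hypothetical)
		using (ZipArchive archive = ZipFile.OpenRead("issues.bcfzip"))
		{
			foreach (ZipArchiveEntry entry in archive.Entries)
			{
				//Each topic folder contains a markup.bcf file
				if (!entry.FullName.EndsWith("markup.bcf", StringComparison.OrdinalIgnoreCase)) continue;

				XDocument markup = XDocument.Load(entry.Open());
				XElement topic = markup.Root.Element("Topic");
				if (topic != null && topic.Element("Title") != null)
				{
					Console.WriteLine(topic.Element("Title").Value);
				}
			}
		}
	}
}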

Error in the BCF file

I have tested Sloth with BCF files created in bimsync, Tekla BIMsight, BIMcollab and Solibri. I assume that BIMcollab’s Revit and Navisworks plugins will work as well; please let me know if that is not the case.

As usual, the application is open-source, licensed under the MIT Licence. You can find the source on GitHub. Please don’t hesitate to report any issues you might find, and happy reporting!

A new version of Group Clashes

I just updated my Group Clashes plugin for Navisworks.

The main purpose of this update was to solve some issues encountered by early adopters. The plugin should now be more stable. Many thanks to the users who contributed to this release with their feedback and test models.

I also added some new features to the application, which I will showcase with this example. Detecting clashes in this simple layout of ducts (Selection A, in blue) and pipes (Selection B, in red) yields 8 clashes, spread like this:

An example of a clash test

Creating subgroups

Grouping with two conditions will now create a first set of groups according to the first rule, then divide each of these groups into subgroups, following the second rule.

If we group this example by Selection A (the ducts), we get the following groups, named after the item used to create each group. All clashes involving the same item from Selection A are grouped together:

Grouping by Selection A

If we add a new grouping rule, say, by grid intersection, the plugin will break these groups into subgroups, according to this new rule. Here, it will rename the existing groups by adding the nearest grid intersection name, and break the group {Clash 7, Clash 8} into two, each belonging to a different grid intersection.

Grouping by Selection A and Grids
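
Conceptually, this two-pass grouping is just a group-by followed by a nested group-by. Here is a minimal sketch of the idea, using simplified placeholder types rather than the actual Navisworks clash objects handled by the plugin:

using System;
using System.Collections.Generic;
using System.Linq;

//Simplified placeholder for a clash result, for illustration only
class Clash
{
	public string Name;
	public string ItemA;            //Item from Selection A involved in the clash
	public string GridIntersection; //Nearest grid intersection
}

static class TwoRuleGrouping
{
	//First rule: group by the Selection A item; second rule: split each group by grid intersection
	public static Dictionary<string, List<Clash>> Group(IEnumerable<Clash> clashes)
	{
		return clashes
			.GroupBy(c => c.ItemA)
			.SelectMany(g => g
				.GroupBy(c => c.GridIntersection)
				.Select(sub => new { Name = g.Key + " - " + sub.Key, Items = sub.ToList() }))
			.ToDictionary(x => x.Name, x => x.Items);
	}
}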

Keep existing groups

You can now keep existing groups, so Group Clashes will run only on the remaining clashes. This opens more possibilities for complex workflows to group and sort clash results.

For example, you can group by status, explode the “Active” group and keep the “Approved” group. If you then run the grouping function again, this time by level, it will only take into account the remaining clash results and keep the “Approved” group intact, allowing you to focus on the “Active” clashes.

As another example, if we manually create the following group in our previous clash test:

An existing group

We can then run a Group by Grid Intersection on the remaining clashes to create the following groups, without disturbing the existing one.

Remaining clashes are grouped

Batch grouping

You can now select multiple clash tests to apply the same grouping rules to all of them.

Batch processing

Along with these new features, a few corrections have been made. Some groups were not properly named; this is now corrected, and group names should be more meaningful. Missing grids or levels are now better handled, and the application should be generally more stable.

As usual, you will find this new version of Group Clashes on the Autodesk App Store. If you want to develop your own grouping rules, you can also find the entire source code on GitHub.

Happy grouping!

Why I am now a bimsync fanboy

Those of you who know me know that I recently changed employers and am now working for a real estate developer, with a different scope of work than in my previous position. This led me to put aside Revit and Dynamo for a while, and think more about a project-wide collaboration platform.

Alongside the ubiquitous BIM 360 platform from Autodesk, there are a lot of lesser-known solutions from various developers. Every large BIM software vendor has its own, and many other companies offer one. Among them, bimsync is a rather discreet application from Catenda, a Norwegian company. Having heard about it at the French Revit User Group, I had to give it a try, and I was not disappointed.

What immediately caught my attention is the web viewer, which is the best I have ever tried. They developed their own web-based IFC viewer, and it is not far from perfect. The viewer is extremely easy to use with the left mouse button and the wheel, and includes every needed feature. Sectioning the model is also quite well implemented and does a very good job. My only wish here is a filled pattern to highlight the difference between solid and void while sectioning.

sectionnningwithdiff

The viewer is also quite powerful: I tried it with 1 GB worth of IFC files, and it ran “almost” smoothly on my iPad.

onipad2

bimsync does a good job at uploading, viewing and managing versions of various models, and provides a thoughtful way of managing revisions.

You start by creating a “model”, which on bimsync is a placeholder where you will upload the successive versions of an IFC file. Once a file is processed by the platform, it is available for review along with the other models. The processing part can be rather slow, more than one hour for a 500 MB IFC file, but it happens entirely online; you don’t even have to keep your computer on.

models

A key feature of bimsync is the ability to easily extract Excel schedules from the uploaded models. Having the ability to show data from a model in a nice spreadsheet is priceless, and this is something that is generally overlooked by their competitors.

The issue tracking solution integrated in bimsync is also very efficient and well-thought-out, with a lot of nice features.

You start by creating an issue directly on the model; you can then assign someone responsible for solving it, add a due date and write a few lines of comment.

issues

These issues are grouped by boards, and you can create as many boards as you want. You can also keep track of the resolution of these issues with a few statistical tools and filters, and save reports in Excel.

statistics2

You know that I am a big fan of the BCF concept, and bimsync doesn’t disappoint in this regard, providing first-class BCF 1.0 and 2.0 support. You can export your issues as BCF to display them directly in your authoring platform of choice.

Catenda was kind enough to provide me with access to their API, and after a few tests, I found it quite easy to use and powerful. I think this enables very interesting workflows, like automatically displaying key metrics in an easy-to-consume Power BI dashboard.
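
To give an idea of what these workflows look like, here is a minimal sketch of calling the API over HTTP; the endpoint URL and the authentication details are assumptions made for illustration, so check Catenda’s documentation for the real API surface:

using System;
using System.Net.Http;
using System.Net.Http.Headers;

class BimsyncApiSketch
{
	static void Main()
	{
		using (HttpClient client = new HttpClient())
		{
			//OAuth2 bearer token obtained beforehand (hypothetical value)
			client.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue("Bearer", "<access_token>");

			//Hypothetical "list projects" endpoint
			string json = client.GetStringAsync("https://api.bimsync.com/v2/projects").Result;

			//The raw JSON could then be parsed and pushed to a Power BI dataset
			Console.WriteLine(json);
		}
	}
}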

I have yet to explore all the features, especially the libraries, but bimsync is now my top choice among the web-based BIM collaboration platforms, and I am eager to explore more workflows with it.

Group Clashes

If you have already run a clash detection, you have probably ended up with thousands of clashes, wondering how you could find anything interesting in this mess.

neuro

Furthermore, finding clashes is not really useful in itself. The purpose of clash detection is to find and hopefully solve issues in the design. We quickly realize that a clash is not an issue in itself, but can be the symptom of one. Being able to extract a real issue from a meaningless bunch of clashes is what we are looking for; this is how we get some return from clash detection.

To do so, I tend to focus on specific subjects. Instead of running useless clash detections between entire models, I focus on specific issues and know beforehand what I am looking for.

For example, instead of running a clash detection between an architectural and a structural model, and ending up with thousands of clashes, we can run a clash detection only between architectural rooms and structural concrete beams. As we know which types of elements are involved in the clash detection, we can understand what a “clash” means: here, it means that there is a beam below the ceiling height. Furthermore, we can also group these clashes by room, and immediately highlight the problematic areas where we should focus our efforts.

clearheadroom

To help in this regard, I created a plugin for Navisworks Manage to automatically group these clashes. After a beta version published last year, I finally took the time to properly develop a full-fledged application and publish it on the Autodesk App Store. It is a fully redesigned Group Clashes Navisworks plugin, with a new interface that integrates seamlessly into Navisworks.

interface

The grouping rules have also been redesigned, and now include the following methods:

  • Group by Level: This rule will group clashes according to their nearest level, and name the group after the level (see the sketch after this list).

bylevels

  • Group by Grid Intersection: This rule will group clashes according to their nearest grid intersection, and name the group after these grids.

bygrids

  • Group by Selection A: This rule will group all clashes belonging to an element of Selection A, and name the group after this element. As an example, if a room in Selection A has many clashes with beams in Selection B, all these clashes will be grouped.

byselectiona

  • Group by Selection B: This rule will group all clashes belonging to an element of Selection B, and name the group after this element.

byselectionb

  • Group by Assigned To, Approved By, and Status: These three rules will use various properties of the clash to group them. As an example, you can use this rule to group all clashes assigned to you.
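
To illustrate the Group by Level rule mentioned at the top of this list, here is a minimal sketch of the underlying idea. The Clash and Level types and their properties are simplified placeholders, not the actual Navisworks API objects used by the plugin:

using System;
using System.Collections.Generic;
using System.Linq;

//Simplified placeholder types, for illustration only
class Clash { public string Name; public double CenterZ; }
class Level { public string Name; public double Elevation; }

static class NearestLevelGrouping
{
	//Group clashes by the level whose elevation is closest to the clash center
	public static Dictionary<string, List<Clash>> GroupByNearestLevel(IEnumerable<Clash> clashes, IList<Level> levels)
	{
		return clashes
			.GroupBy(c => levels.OrderBy(l => Math.Abs(l.Elevation - c.CenterZ)).First().Name)
			.ToDictionary(g => g.Key, g => g.ToList());
	}
}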

You can also group with two rules: the first one will rename the clash groups created by the second one. This enables various possibilities, like having clashes grouped by a given selection and renamed according to the person they are assigned to.

Group Clashes can have difficulties managing clash reports with more than 10,000 clashes. Be smart when you set up your clash tests, and everything will be fine.

Group Clashes is now out of beta, and you will have to pay $10 to download it on the Autodesk App Store.

Yet, Group Clashes is also open-source. You can find the entire source code on GitHub, build your own plugin, and develop your own grouping rules.

Wall Openings (again)

I am kind of obsessed with wall openings. The entire process of asking a structural engineer for holes in their beloved walls to let ducts and pipes go through has always been a rather frustrating experience.

After my first article about a semi-automated way of placing an opening family, here is my latest attempt at creating the perfect opening family.

A good opening family must have the following features:

  • hosted on a wall, a slab or a beam
  • visible in a 3D view
  • fully parametric
  • schedulable
  • sharable in its own model
  • a nice 2D representation

Let’s explore these features.

To host them, but still keep the ability to share them in their own model, I use the Generic Model Face Based family template to create my opening family. This template allows me to place my opening on any wall, slab or beam, even if they are in a linked model.

In this Generic Model, I create the Opening subcategory, where I will place every element of my opening. This will allow me to fine-tune the display of my openings in my model.

I create a bunch of reference planes, and drive them with three shared parameters, Width, Height and Depth. These reference planes help me constrain the Void Form that will cut the host. This void form will create an actual hole in the wall or slab, and will allow me to perform accurate clash detections after integrating these openings.

To be able to see my opening in a 3D view, I draw some Model Lines to outline the general shape of the opening, and place them in the “Opening” subcategory. These Model Lines are only visible in 3D.

3D View

The 2D representation is a pretty complex subject, and I have yet to find the perfect solution. After some experiments with model lines, I have switched to annotation elements. These annotation elements are drawn in two nested families, one for the projection symbol, the other for the cut symbol.

These annotation families are drawn in a Generic Model, with the “Opening” subcategory, in order to follow the same graphical rules as the main family.

I also use Masked Regions to draw filled patterns, and use the Generic Model Override in a plan view to fill them with black. I am not entirely satisfied with this solution, but the workaround involves Detail Items, and I don’t want to deal with two different categories.

PlanSection 1

Section 2

To display the elevation of my opening family in a tag or a schedule, I create two shared parameters, Top Elevation and Bottom Elevation.

As I was searching for a solution to calculate the elevation, I noticed a feature I wasn’t aware of, the “Schedule Level”, present since at least Revit 2015, which allows us to define a reference level. Revit uses it to automatically calculate the corresponding elevation. Since this elevation cannot be used directly in a schedule or a tag, I use a Dynamo definition to update the elevation values in my shared parameters. This Dynamo definition performs some simple calculations to retrieve the top and bottom elevations, and sends these values to the corresponding shared parameters.
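
The article relies on Dynamo for this, but the calculation itself is straightforward. Here is a rough C# equivalent for a Revit macro; the parameter names (Height, Top Elevation, Bottom Elevation) and the assumption that the insertion point sits at the center of the opening come from my family and are to be adapted to yours:

public void UpdateOpeningElevations()
{
	Document doc = this.ActiveUIDocument.Document;

	using (Transaction tx = new Transaction(doc))
	{
		tx.Start("Update opening elevations");

		foreach (FamilyInstance opening in new FilteredElementCollector(doc)
			.OfCategory(BuiltInCategory.OST_GenericModel)
			.OfClass(typeof(FamilyInstance))
			.Cast<FamilyInstance>())
		{
			LocationPoint location = opening.Location as LocationPoint;
			Parameter height = opening.LookupParameter("Height");
			Parameter top = opening.LookupParameter("Top Elevation");
			Parameter bottom = opening.LookupParameter("Bottom Elevation");
			if (location == null || height == null || top == null || bottom == null) continue;

			//Assume the insertion point sits at the center of the opening
			double centerZ = location.Point.Z;
			top.Set(centerZ + height.AsDouble() / 2);
			bottom.Set(centerZ - height.AsDouble() / 2);
		}

		tx.Commit();
	}
}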

Dynamo

My work with wall openings is far from complete, and subjects like sharing these openings and managing their modifications are still pending. You will find here the different families, models and Dynamo definitions used in this article; feel free to use or improve on them.

Flux

After showcasing Metro, a web-based interface for interpreting and visualizing building codes, Flux has released its new product, simply called Flux. I just got an invitation to their beta, and here are the results of my first experiments.

fluxLogo

Flux is the first startup to spin out of the semi-secret Google X lab. Its goal is to build a platform to design buildings more easily, but also more ecologically. So I was pretty excited when I heard about their new product, a suite of tools to link together my favorite playthings: Dynamo, Grasshopper, and Excel.

Flux works as a central repository for exchanging data between Grasshopper, Excel or Dynamo. Along with the website, Flux provides plugins for these solutions. As we work in our favorite design tool, we use the Flux plugin to upload or download data to or from the Flux central server.

workflows


Flux is organized around projects, each project containing a set of Data Keys. These Data Keys store values retrieved from Excel, Dynamo or Grasshopper. We can’t edit these values directly in the Flux interface, but we can display them in the Data View.

DataView

Once in Flux, Data Keys can be linked together in the Flow. This interface provides a visual programming language to transform data as it passes through Flux.

FlowView

The initial tutorial shows us how to exchange data between Excel and Grasshopper. After going through this starter project, I gave the Dynamo plug-in a try.

I built upon a common workflow, where an HVAC engineer retrieves MEP space locations and areas from a Revit model and defines in Excel a set of values to be uploaded back to Revit. For the sake of this experiment, I am using the Specified Supply Airflow, but this should work with any value, such as the occupancy of a room or the section of a duct.

I am using here one of my projects, which contains a thousand MEP Spaces, and retrieve some of their parameters in Dynamo. Using the GetParameterValueByName node, I retrieve four lists for space names, numbers, areas, and levels.

RetriveProperties

The Flux plug-in for Dynamo presents itself as a set of six nodes and allows us to select a project, find data keys in this project, and get or push values from or to Flux. I connect my GetParameterValueByName nodes to the ToFlux nodes, and these values are uploaded to Flux.

ToFlow

I open a new Excel spreadsheet and use the Flux plugin to create three columns, for Name, Number and Area of the MEP Spaces. Flux automatically fills the spreadsheet with the values retrieved in Revit. I create a fourth column for Specified Supply Airflow and fill in some airflow values. As I hit enter, these values are uploaded to Flux and displayed in the Data view.

Excel

Back in Dynamo, I create a third group of nodes and link the FromFlux node to a SetParameterByName node. This completes the loop, and every Specified Supply Airflow value defined in Excel is added to the MEP Spaces.

FromFlux

The entire workflow takes some time to set up, but the result is pretty impressive, and I see many possibilities around this kind of web-based exchange. Flux also offers the possibility to upload geometry created in Grasshopper or Dynamo, and I still have a lot to test with this new tool.

Time Stamper, the Add-In

There is no easy way to override the color of an entire Revit link. Since most of my work involves linking Revit models from various subcontractors, this is something I badly miss.

Until recently, I was still using a workset hack to create filters on linked models. These filters allowed me to display each linked model with my color of choice.

But relying on worksets leaves much to be desired, and I had to find another solution.

I recently came up with a different solution, where each model element knows where it came from.

The idea is to add the file name, date and version as shared parameters on every model element. I created the corresponding Revit Add-In, and it is now available on the Autodesk App Exchange.

This application will:

  • Ask for identification information
  • Create four shared parameters on every category
  • Fill these shared parameters with identification information

The code itself is pretty simple, but there are a lot of applications.

First, it becomes easy to create filters on linked models. These filters allow us to display each linked model with a different color.

ColorsByModel

But it also enables us to tag the origin of every element, as we can see in the following screenshot.

IdentifyElements

You can create a linked models schedule, with date and version.

LinkedFilesSchedule

To help you create filters and tags with these parameters, you will find here the shared parameter text file. Please also note that the application uses a list of categories to create the four shared parameters. You will find this list here.

I hope this application will help you in your work, don’t hesitate to share your suggestions in the comments.

Some Thoughts About Rooms and Spaces

A lot of the MEP calculations available in Revit need analytical spaces at some point, and it is crucial to have them properly modeled.

Properly placing these spaces can be a tedious business, especially for large buildings with hundreds of rooms. You can use the Place Space Automatically function, but it can create spaces in undesired places, and does not necessarily match the architectural rooms.

Furthermore, superposing a space upon an existing architectural room only retrieves the name and the number of the corresponding room, and falls short for any other property we might want to add to our MEP space.

One of the solutions for creating MEP spaces is to match rooms created in the linked architectural model. To do so, I wrote a small application for creating a space for each room defined in a linked model.

This application loops over every linked instance, searching for rooms, and uses these rooms to create the corresponding spaces, retrieving in the process a handful of parameters, like name, number, and, most critically, Limit Offset.

You can find here a little screencast showing how it works. The entire source code is available below.

While working on it, I also found an interesting property: Computation Height. This is a level property, and it allows us to define the height at which room boundaries are computed on this level.

Let’s create three rooms on a given level:

ThreeRooms

But if we add some variation to the floor level, the room disappears with the following warning:

Warning

ARoomDisappear

By default, these rooms are calculated at the level elevation: every wall at “0 m” above the level will be used as a room boundary.

The Computation Height property allows us to change the elevation at which the room is calculated. In our example, we change the Computation Height of Level 1 to 1 meter, and the room fits nicely between its boundaries.

ComputationHeight

Of course, this is also true for Spaces.
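
As a side note, this property can also be set through the API. Here is a minimal sketch, assuming it is exposed as BuiltInParameter.LEVEL_ROOM_COMPUTATION_HEIGHT (which is my understanding, to be verified against your Revit version):

public void SetComputationHeight()
{
	Document doc = this.ActiveUIDocument.Document;

	using (Transaction tx = new Transaction(doc))
	{
		tx.Start("Set Computation Height");

		foreach (Level level in new FilteredElementCollector(doc).OfClass(typeof(Level)).Cast<Level>())
		{
			//Computation Height is stored in Revit internal units (feet), so convert from meters
			Parameter computationHeight = level.get_Parameter(BuiltInParameter.LEVEL_ROOM_COMPUTATION_HEIGHT);
			if (computationHeight != null && !computationHeight.IsReadOnly)
			{
				computationHeight.Set(UnitUtils.ConvertToInternalUnits(1.0, DisplayUnitType.DUT_METERS));
			}
		}

		tx.Commit();
	}
}

Below is the full RoomToSpace macro mentioned earlier.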

public void RoomToSpace()
{
	Document activeDocument = this.ActiveUIDocument.Document;
	
	//Get all linked instances
	FilteredElementCollector collector = new FilteredElementCollector(activeDocument);
	List<RevitLinkInstance> linkInstances = collector.OfCategory(BuiltInCategory.OST_RvtLinks).WhereElementIsNotElementType().ToElements().Cast<RevitLinkInstance>().ToList();
	
	//Get all levels
	collector = new FilteredElementCollector(activeDocument);
	List<Level> levels = collector.OfCategory(BuiltInCategory.OST_Levels).WhereElementIsNotElementType().ToElements().Cast<Level>().ToList();
	
	using (Transaction tx = new Transaction(activeDocument))
	{
		tx.Start("Create Spaces");
		
		//Loop on all linked instances
		foreach (RevitLinkInstance linkInstance in linkInstances)
		{
			//Get the linked document
			Document linkedDocument = linkInstance.GetLinkDocument();
			
			//Get the linked instance position
			Transform t = linkInstance.GetTotalTransform();
			
			//Get rooms in the linked document
			collector = new FilteredElementCollector(linkedDocument);
			List<Room> linkedRooms = collector.OfCategory(BuiltInCategory.OST_Rooms).ToElements().Cast<Room>().ToList();
			
			//Create a space for each room
			foreach (Room room in linkedRooms)
			{
				//Skip unplaced rooms, which have no location point
				LocationPoint locationPoint = room.Location as LocationPoint;
				if (locationPoint == null) continue;
				
				//Transform the room location into the coordinates of the host model
				XYZ roomLocationPoint = t.OfPoint(locationPoint.Point);
				
				Level level = GetNearestLevel(roomLocationPoint, levels);
				UV uv = new UV(roomLocationPoint.X, roomLocationPoint.Y);
				
				Space space = activeDocument.Create.NewSpace(level, uv);
				
				space.Number = room.Number;
				space.Name = room.Name;
				
				//Copy the Limit Offset from the room to the space
				Parameter limitOffset = space.get_Parameter(BuiltInParameter.ROOM_UPPER_OFFSET);
				limitOffset.Set(room.get_Parameter(BuiltInParameter.ROOM_UPPER_OFFSET).AsDouble());
			}
		}
		
		tx.Commit();
	}
}

private Level GetNearestLevel(XYZ point, List<Level> levels)
{
	Level nearestLevel = levels.FirstOrDefault();
	double delta = Math.Abs(nearestLevel.ProjectElevation - point.Z);
	
	foreach (Level currentLevel in levels)
	{
		if (Math.Abs(currentLevel.ProjectElevation - point.Z) < delta)
		{
			nearestLevel = currentLevel;
			delta = Math.Abs(currentLevel.ProjectElevation - point.Z);
		}
	}
	
	return nearestLevel;
}

Model Timestamp

As we receive models from subcontractors or partners, we need to integrate them into a coordination model.

The coordination model file structure looks like this:

filestructure

In the coordination model, we use linked views and model-specific overrides to fine-tune the model display. To keep these settings when a linked model is updated, we just overwrite the previous linked file with its new version. This process implies renaming the file each time we receive a new version from a subcontractor. So when we receive a file named with a date or a version, we rename it while performing some quality control checks.

process

But we also have to track which model version is linked in our coordination model. Renaming files is great to keep the link alive, but we lose the original name in the process.

To keep track of the version of the linked file, I create a kind of timestamp on every object of a given model. This application writes version information to four shared parameters, common to every object.

Once in the coordination model, these shared parameters allow us to know which version a given element came from.

IdentificationData

They can also be used to create filters to highlight the origin of each element in a view.

I also found some very interesting side effects. For example, I created a linked models schedule with a multi-category schedule displaying only the four shared parameters.

LinkedModelSchedules

My only concern is the performance of such an application. I ran it on the Revit MEP example file, and it takes 31 seconds, regeneration included. It could easily handle a larger model, but the user will then need some patience while the application runs.

You will find below a piece of code I use to write values on every element of the model. This code does not include any interface, but I hope to be able to publish a packaged version soon.


public void ModelTimeStamp()
{
	Document doc = this.ActiveUIDocument.Document;
	
	using (Transaction tx = new Transaction(doc)) {

		tx.Start("Model TimeStamp");

		//Create a list of categories
		CategorySet myCategories = CreateCategoryList(doc, this.Application);

		//Retrieve all model elements
		FilteredElementCollector collector = new FilteredElementCollector(doc);
		IList<ElementFilter> categoryFilters = new List<ElementFilter>();

		foreach (Category category in myCategories)
		{
			categoryFilters.Add(new ElementCategoryFilter(category.Id));
		}

		ElementFilter filter = new LogicalOrFilter(categoryFilters);

		IList<Element> elementList = collector.WherePasses(filter).WhereElementIsNotElementType().ToElements();

		//Add the values to all elements
		if (elementList.Count > 0)
		{
			foreach (Element e in elementList)
			{
				WriteOnParam("Date", e, DateTime.Now.ToShortDateString());
				WriteOnParam("Version", e, "First Release");
				WriteOnParam("FileName", e, "SubContractors Model");
				WriteOnParam("Trade", e, "HVAC");
			}
		}

		tx.Commit();
	}

}

private void WriteOnParam(string paramName, Element e, string value)
{
	IList<Parameter> parameters = e.GetParameters(paramName);
	if (parameters.Count != 0)
	{
		Parameter p = parameters.FirstOrDefault();
		if (!p.IsReadOnly)
		{
			p.Set(value);
		}
	}
}

private CategorySet CreateCategoryList(Document doc, Autodesk.Revit.ApplicationServices.Application app)
{
	CategorySet myCategorySet = app.Create.NewCategorySet();
	Categories categories = doc.Settings.Categories;

	foreach (Category c in categories)
	{
		if (c.AllowsBoundParameters && c.CategoryType == CategoryType.Model)
		{
			myCategorySet.Insert(c);
		}
	}

	return myCategorySet;
}

DynaWorks

During my research on improving clash detection, I stumbled upon DynaWorks, a great Dynamo package built by Adam Sheather (@Gytaco).

For those who haven’t heard about Dynamo (is there anyone left?), it is a visual programming interface for Revit, pretty much like Grasshopper. Just like Grasshopper, you can improve upon the built-in features with packages from third-party developers. DynaWorks is one of these packages, and provides a set of Dynamo nodes for interacting with Navisworks.

After installing it through the package manager, it presents itself as a set of nodes exposing the main Navisworks functions.

Along with the package, some examples are provided. NavisClashesElementUpdate.dyn contains a definition for retrieving clash results from Navisworks; I used it for my first steps with DynaWorks. If, like me, you want to use these definitions with Navisworks 2016, don’t forget to use Adam Sheather’s trick for updating your Dynamo definitions.

To use DynaWorks, you first have to create a Navisworks file and prepare your clash detections. Here, I create a basic detection between walls (First Clash Item, in blue) and HVAC elements (Second Clash Item, in red), and load the resulting NWF file into my DynaWorks definition.

TestSelection

Basically, this definition retrieves every clash result in every clash test, selects one side of the clash detection, gets the involved element, extracts its Id and uses it to update the Comments parameter in Revit.
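
The DynaWorks example does all of this with nodes, but the last step, writing a value on the elements found by Navisworks, boils down to something like the following Revit API sketch, assuming a plain list of integer element Ids coming from the clash results:

public void MarkClashingElements()
{
	Document doc = this.ActiveUIDocument.Document;

	//Hypothetical list of element Ids extracted from the Navisworks clash results
	List<int> clashingIds = new List<int> { 123456, 123457 };

	using (Transaction tx = new Transaction(doc))
	{
		tx.Start("Mark clashing elements");

		foreach (int id in clashingIds)
		{
			Element element = doc.GetElement(new ElementId(id));
			if (element == null) continue;

			//Write a note in the Comments instance parameter
			Parameter comments = element.get_Parameter(BuiltInParameter.ALL_MODEL_INSTANCE_COMMENTS);
			if (comments != null && !comments.IsReadOnly)
			{
				comments.Set("Clash with HVAC");
			}
		}

		tx.Commit();
	}
}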

process

Simple, but the result is pretty impressive. Instead of painfully extracting element Ids from the Navisworks report, DynaWorks retrieves them for us, and updates elements in Revit accordingly. When applying a filter in the Revit view, we get this:

result

With some tweaks to this initial definition, I was able to implement some kind of dynamic clash detection, with a live update of the Navisworks file as we update the Revit model, but converting an entire Revit model to NWC is way too slow to implement this solution in a production environment. Please feel free to try it by yourself.

DynaWorks contains many other functions to retrieve Navisworks views or selection sets in Revit, and I think this can be the beginning of a great workflow intertwining Revit and Navisworks.