Why I am now a bimsync fanboy

Those of you who know me know that I recently changed my employer and am now working for a real estate developer, with a different scope of work than in my previous position. This led me to put aside Revit and Dynamo for a while and think more about a project-wide collaboration platform.

Alongside the ubiquitous BIM 360 platform from Autodesk, there are many less well-known solutions from various developers. Every large BIM software vendor has its own, and many other companies offer one. Among them, bimsync is a rather low-profile application from Catenda, a Norwegian company. Having heard about it at the French Revit User Group, I had to give it a try, and I was not disappointed.

What immediately caught my attention is the web viewer, which is the best I have ever tried. Catenda develops its own web-based IFC viewer, and it is not far from perfect. The viewer is extremely easy to use with the left mouse button and the wheel, and it includes every needed feature. Sectioning the model is also quite well implemented and does a very good job. My only wish here is to have a fill pattern to highlight the difference between solid and void while sectioning.

sectionnningwithdiff

The viewer is also quite powerful: I tried it with 1 GB worth of IFC files, and it ran “almost” smoothly on my iPad.

onipad2

bimsync does a good job of uploading, viewing, and managing versions of various models, and provides a thoughtful way of handling revisions.

You start by creating a “model”, which on bimsync is a placeholder where you upload the successive versions of an IFC file. Once this file is processed by the platform, it is available for review along with the other models. The processing part can be rather slow, more than one hour for a 500 MB IFC file, but it happens entirely online; you don’t even have to keep your computer running.

models

A key feature of bimsync is the ability to easily extract Excel schedules from the uploaded models. Being able to present model data in a clean spreadsheet is priceless, and it is something generally overlooked by its competitors.

The issue tracking solution integrated in bimsync is also very efficient and well-thought-out, with a lot of nice features.

You start by creating an issue directly on the model; you can then assign someone responsible for solving it, add a due date, and write a few lines of comment.

issues

These issues are grouped by boards, and you can create as many boards as you want. You can also keep track of the resolution of these issues with a few statistical tools and filters, and save reports in Excel.

statistics2

You know that I am a big fan of the BCF concept, and bimsync doesn’t disappoint me in this regard, providing first-class BCF 1 and 2 support. You can export your issues in BCF to display them directly in your authoring platform of choice.

Catenda was kind enough to provide me with access to their API, and after a few tests, I found it quite easy to use and powerful. I think it enables very interesting workflows, like automatically displaying key metrics in an easy-to-consume Power BI dashboard.
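
To give an idea of what such a workflow could look like, here is a minimal sketch that pulls project data over HTTP before reshaping it for a dashboard. It is not based on the actual bimsync API reference: the endpoint URL, project id, and token below are placeholders to be replaced with the real ones from Catenda’s documentation.

// Hedged sketch only: the URL and token below are placeholders, not real bimsync endpoints
using System;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Threading.Tasks;

class BimsyncApiSketch
{
	static async Task Main()
	{
		using (HttpClient client = new HttpClient())
		{
			// Authenticate with a bearer token (hypothetical value)
			client.DefaultRequestHeaders.Authorization =
				new AuthenticationHeaderValue("Bearer", "YOUR_ACCESS_TOKEN");

			// Hypothetical endpoint returning the issues of a project as JSON
			string url = "https://api.example.com/v2/projects/YOUR_PROJECT_ID/issues";
			string json = await client.GetStringAsync(url);

			// From here, the JSON could be reshaped and pushed into a Power BI dataset
			Console.WriteLine(json);
		}
	}
}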

I have yet to explore all the features, especially the libraries, but bimsync is now my top choice among the web-based BIM collaboration platforms, and I am eager to explore more workflows with it.

Flux

After showcasing Metro, a web-based interface for interpreting and visualizing building codes, Flux has released its new product, simply called Flux. I just had the chance to get an invitation to their beta, and here are the results of my first experiments.

fluxLogo

Flux is the first startup to spin out of the semi-secret Google X lab. Its goal is to build a platform to design buildings more easily, but also more sustainably. So I was pretty excited when I heard about their new product, a suite of tools to link together my favorite playthings: Dynamo, Grasshopper, and Excel.

Flux works as a central repository for exchanging data between Grasshopper, Excel, and Dynamo. Along with the website, Flux provides plugins for these tools. As we work in our favorite design tool, we use the Flux plugin to upload data to or download data from the Flux central server.

workflows

 

Flux is organized around projects, each project containing a set of Data Keys. These Data Keys store values retrieved from Excel, Dynamo, or Grasshopper. We can’t edit these values directly in the Flux interface, but we can display them in the Data View.

DataView

Once in Flux, Data Keys can be linked together in the Flow. This interface provides a visual programming environment to transform data as it passes through Flux.

FlowView

The initial tutorial shows us how to exchange data between Excel and Grasshopper. After going through this starter project, I give the Dynamo plug-in a try.

I build upon a common workflow, where an HVAC engineer retrieves MEP space locations and areas from a Revit model and defines in Excel a set of values to be uploaded back into Revit. For the sake of this experiment, I am using the Specified Supply Airflow, but this should work with any value, such as the occupancy of a room or the cross-section of a duct.

I am using here one of my projects, which contains a thousand MEP Spaces, and retrieve some of their parameters in Dynamo. Using the GetParameterValueByName node, I retrieve four lists for space names, numbers, areas, and levels.

RetriveProperties
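
For those more comfortable with the API than with Dynamo, the same extraction can be sketched in a few lines of C#. This is a minimal macro-style sketch, not the Dynamo graph itself: it collects every MEP Space and reads its name, number, area, and level.

public void ListSpaces()
{
	Document doc = this.ActiveUIDocument.Document;

	// Collect every MEP Space in the model
	FilteredElementCollector spaces = new FilteredElementCollector(doc)
		.OfCategory(BuiltInCategory.OST_MEPSpaces)
		.WhereElementIsNotElementType();

	// Space lives in the Autodesk.Revit.DB.Mechanical namespace
	foreach (Space space in spaces)
	{
		string name = space.get_Parameter(BuiltInParameter.ROOM_NAME).AsString();
		string number = space.Number;
		double area = space.Area; // Revit internal units: square feet
		string level = space.Level != null ? space.Level.Name : string.Empty;
		// These are the four lists pushed to Flux through the ToFlux nodes
	}
}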

The Flux plug-in for Dynamo presents itself as a set of six nodes and allows us to select a project, find data keys in this project, and get or push values from or to Flux. I connect my GetParameterValueByName nodes to the ToFlux nodes, and these values are uploaded to Flux.

ToFlow

I open a new Excel spreadsheet and use the Flux plugin to create three columns, for the Name, Number, and Area of the MEP Spaces. Flux automatically fills the spreadsheet with the values retrieved from Revit. I create a fourth column for the Specified Supply Airflow and fill in some airflow values. As I hit Enter, these values are uploaded to Flux and displayed in the Data View.

Excel

Back in Dynamo, I create a third group of nodes and link the FromFlux node to a SetParameterByName node. This completes the loop, and every Specified Supply Airflow value defined in Excel is added to the MEP Spaces.

FromFlux
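
The write-back step can also be sketched through the API. This is a rough equivalent of the SetParameterByName node, assuming a constant placeholder value instead of the values coming from Flux, and looking the parameter up by its display name.

public void SetSupplyAirflow()
{
	Document doc = this.ActiveUIDocument.Document;

	using (Transaction tx = new Transaction(doc))
	{
		tx.Start("Set Specified Supply Airflow");

		foreach (Space space in new FilteredElementCollector(doc)
			.OfCategory(BuiltInCategory.OST_MEPSpaces)
			.WhereElementIsNotElementType())
		{
			// Look up the parameter by its display name
			Parameter airflow = space.LookupParameter("Specified Supply Airflow");
			if (airflow != null && !airflow.IsReadOnly)
			{
				// Placeholder value, expressed in Revit internal flow units
				airflow.Set(0.5);
			}
		}

		tx.Commit();
	}
}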

The entire workflow takes some time to set up, but the result is pretty impressive, and I see many possibilities for this kind of web-based exchange. Flux also supports uploading geometry created in Grasshopper or Dynamo, and I still have a lot to test with this new tool.

Model Timestamp

As we receive models from subcontractors or partners, we need to integrate them into a coordination model.

The coordination model file structure looks like this:
filestructure

In the coordination model, we use linked views and model-specific overrides to fine-tune model display. To keep these settings when a linked model is updated, we just overwrite the previous linked file with its new version. This process requires renaming the file each time we receive a new version from a subcontractor. So when we receive a file named with a date or a version number, we rename it while performing some quality control checks.

process

But we also have to keep track of which model version is linked in our coordination model. Renaming files keeps the link alive, but we lose the original name in the process.

To keep track of the version of the linked file, I create a kind of timestamp on every object of a given model. This application writes version information into four shared parameters common to every object.

Once in the coordination model, these shared parameters allow us to know which version a given element came from.

IdentificationData

They can also be used to create filters to highlight the origin of each element in a view.

I also found some very interesting side effects. For example, I can create a linked model schedule with a multi-category schedule displaying only the four shared parameters.

LinkedModelSchedules

My only concern is the performance of such an application. I ran it on the Revit MEP example file, and it took 31 seconds, regeneration included. It could easily handle a larger model, but the user will then need some patience while the application runs.

You will find below the piece of code I use to write values on every element of the model. This code does not include any interface, but I hope to publish a packaged version soon.

 

public void ModelTimeStamp()
{
	Document doc = this.ActiveUIDocument.Document;
	
	using (Transaction tx = new Transaction(doc))
	{

		tx.Start("Model TimeStamp");

		//Create a list of categories
		CategorySet myCategories = CreateCategoryList(doc, this.Application);

		//Retrieve all model elements
		FilteredElementCollector collector = new FilteredElementCollector(doc);
		IList<ElementFilter> categoryFilters = new List<ElementFilter>();

		foreach (Category category in myCategories)
		{
			categoryFilters.Add(new ElementCategoryFilter(category.Id));
		}

		ElementFilter filter = new LogicalOrFilter(categoryFilters);

		IList<Element> elementList = collector.WherePasses(filter).WhereElementIsNotElementType().ToElements();

		//Add the values to all elements
		if (elementList.Count > 0)
		{
			foreach (Element e in elementList)
			{
				WriteOnParam("Date", e, DateTime.Now.ToShortDateString());
				WriteOnParam("Version", e, "First Release");
				WriteOnParam("FileName", e, "SubContractors Model");
				WriteOnParam("Trade", e, "HVAC");
			}
		}

		tx.Commit();
	}

}

// Write a text value into the named parameter of an element, if it exists and is not read-only
private void WriteOnParam(string paramName, Element e, string value)
{
	IList<Parameter> parameters = e.GetParameters(paramName);
	if (parameters.Count != 0)
	{
		Parameter p = parameters.FirstOrDefault();
		if (!p.IsReadOnly)
		{
			p.Set(value);
		}
	}
}

// Gather every model category that accepts bound (shared) parameters
private CategorySet CreateCategoryList(Document doc, Autodesk.Revit.ApplicationServices.Application app)
{
	CategorySet myCategorySet = app.Create.NewCategorySet();
	Categories categories = doc.Settings.Categories;

	foreach (Category c in categories)
	{
		if (c.AllowsBoundParameters && c.CategoryType == CategoryType.Model)
		{
			myCategorySet.Insert(c);
		}
	}

	return myCategorySet;
}

Revit Options

Revit Design Options have been a mystery to me for quite a long time, so I decided to write a few lines about them in order to better understand their possibilities.

To do so, let’s model a small house, nothing fancy, but enough to have some possibilities for evolution. This house is drawn in the Main Model, and will be common to all options.

SomeHouse

Now let’s say I want to try something. I put the emphasis on try, because that is really what Design Options are useful for. I open the Design Options editor and create a new option set, which is automatically populated with a first option. I rename it like this:

CreateOptions

I select this new option as the current one using the option drop-down menu at the bottom of the Revit screen, and start modeling a duct layout.

FirstLayoutOption

To create a second routing option, I duplicate the first one and rename it. Every element of my First Routing Option is now duplicated in my Secondary Routing Option.

CreateSecondOption

I edit these duplicated elements to create a second duct layout. In the process, I realize that families edited while editing an option are actually edited for the whole model, so this kind of change will impact every other option.

SecondLayoutOption

I now have two different duct layouts in my model, which are displayed when I select one or the other design option in the design options drop-down menu.

But these options can also be displayed on a per-view basis. Once you have created some design options, a new Design Options tab appears in the Visibility/Graphics Overrides dialog, allowing you to select which option to display.

OverrideOptions

This can be used to display our two options on the same sheets:

Options

It can also be used to display the metrics of each option side by side to decide which one should be kept. For example, I show here a duct schedule and a duct fitting schedule for each option:

Schedule
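
The same comparison can also be scripted. Here is a minimal sketch using the Revit API’s ElementDesignOptionFilter that counts the duct segments belonging to each design option; it complements rather than replaces the schedules above.

public void CompareDesignOptions()
{
	Document doc = this.ActiveUIDocument.Document;
	string report = "";

	// Loop over every design option in the model
	foreach (Element option in new FilteredElementCollector(doc).OfClass(typeof(DesignOption)))
	{
		// Count the ducts assigned to this option
		int ductCount = new FilteredElementCollector(doc)
			.OfCategory(BuiltInCategory.OST_DuctCurves)
			.WherePasses(new ElementDesignOptionFilter(option.Id))
			.WhereElementIsNotElementType()
			.GetElementCount();

		report += option.Name + ": " + ductCount + " duct segments\n";
	}

	TaskDialog.Show("Design Options", report);
}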

These data give us powerful insights for choosing an option. Once one of them is validated, Design Options provides some tools to integrate our option into the main project.

We first have to Make Primary our selected option, here the second one. We can see it become visible from the Main Model.

MakePrimary

Since linked models only display their primary options, you have to make sure the selected option is primary before using the model as a reference.

Selecting Accept Primary integrates our primary option into the Main Model and deletes every other option associated with the option set. Our option is now part of our model.

Revit Database

I have wanted to export a whole Revit project to a database for a while. I had a few prior experiences with MySQL and the SQL management software Toad, but never with the Revit database.
This time, I decided to use SQL Server, along with SQL Server Management Studio, the solution developed by Microsoft.
During my first research into exporting the Revit database, I came across Revit DB Link, a plug-in available on the Autodesk Subscription website. This add-in allows you to export the Revit database, but also to edit it and import it back into Revit.
Using it is pretty easy when you have all the correct software installed.

Exporting

First, you have to configure a new connection. After starting the Revit DB Link add-in, select [Select a new connection] in the ODBC panel, and click Export:

LinkInterface
Type a name for your export configuration, and select New. Select the SQL Server database, and follow the instructions to create the link file to your database.
In this window, fill in the description of your database, and select the SQL Server you want to connect to. Here, I am working with a SQL Server instance named SQLEXPRESS:

DBSelection
Select a specific database for this export. I created mine in SQL Server Management Studio before starting the export. After a summary page, the connection is established.
The export runs smoothly, and I get a bunch of tables, one per Revit category, along with others with more cryptic names.

Using

One of the most obvious applications is to create multi-model schedules. Creating a schedule across multiple Revit files can be very tedious, especially with large models. Exporting every Revit file to its own database allows us to merge quantities with a simple SQL UNION command, as in the sketch below. You just have to make sure to export each model to its own database, to prevent DB Link from overwriting a previous model’s export.
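
As an illustration, here is a minimal sketch of such a merged schedule, run from a small C# console application. The database, table, and column names (ModelA, ModelB, Walls, Length) are assumptions to adapt to your own export.

using System;
using System.Data.SqlClient;

class MergedScheduleSketch
{
	static void Main()
	{
		// Hypothetical SQLEXPRESS instance holding one database per exported model
		string connectionString = @"Server=.\SQLEXPRESS;Database=ModelA;Integrated Security=true;";

		// Merge wall quantities from the two model databases
		string query =
			@"SELECT 'Model A' AS Model, Id, Length FROM ModelA.dbo.Walls
			UNION ALL
			SELECT 'Model B' AS Model, Id, Length FROM ModelB.dbo.Walls";

		using (SqlConnection connection = new SqlConnection(connectionString))
		using (SqlCommand command = new SqlCommand(query, connection))
		{
			connection.Open();
			using (SqlDataReader reader = command.ExecuteReader())
			{
				while (reader.Read())
				{
					Console.WriteLine("{0} - {1}: {2}", reader["Model"], reader["Id"], reader["Length"]);
				}
			}
		}
	}
}
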
Most parameters editable in Revit can also be changed in the database and imported back into the Revit model.
As an example, you can edit duct widths in the database and import the values back to modify the Revit duct sizes:
 SQLEDit
Before:
Before
After:
After
There are also some specific tables for accessing relations between Revit objects.
For example, the DoorWall table links each door to its hosting wall. The RoomAssociations table allows us to retrieve every element inserted into a specific room, to create furniture schedules for example.
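
A furniture count per room could be sketched with a query along these lines, run with the same ADO.NET pattern as above; the column names here are assumptions to check against the exported schema.

// Hypothetical column names: verify them in the exported RoomAssociations table first
string furniturePerRoom =
	@"SELECT r.RoomId, COUNT(*) AS FurnitureCount
	FROM RoomAssociations AS r
	INNER JOIN Furniture AS f ON f.Id = r.ElementId
	GROUP BY r.RoomId";
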
Exporting the Revit database to SQL Server can be a very powerful tool and provides us with new means for creating complex schedules.