
Revit to SQL


As building design and production evolve with technology, the AEC industry is fast becoming one of the world's largest producers of data. Data is therefore the foundation of every BIM-based construction project. Without data, a BIM model is nothing more than 3D geometry.

FMI Corp. has published a white paper on the impact of big data on the AEC industry. It illustrates the long-term benefits of using big data as a business tool, addresses some of the most challenging aspects of working with it, and shows the opportunities that arise when big data and analytics are properly applied.

Some of the key findings are:

  • 96% of all data goes unused in the E&C industry.

  • 13% of working hours are spent looking for project data and information.

  • 30% of E&C companies are using applications that don’t integrate with one another.

  • 90% of data generated is unstructured. This includes tweets, photos, customer purchase history and even customer service call logs.


Transform Revit Data into Useful Information

Dynamo gives users powerful data mining capabilities. Using packages, we can import, modify and export Revit data to various external targets such as Excel, Microsoft SQL Server, MySQL or SQLite. With this method it is possible to store data directly in a SQL database and integrate it with other software.
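As a minimal sketch of that export step, the snippet below writes element records into a SQLite file. It assumes a CPython 3 environment (for example a current Dynamo Python node) where the standard-library sqlite3 module is available; the table and column names are illustrative, not taken from this article.

```python
import sqlite3

def export_rows(db_path, rows):
    """rows: iterable of (element_id, category, name, level) tuples."""
    con = sqlite3.connect(db_path)
    try:
        con.execute(
            """CREATE TABLE IF NOT EXISTS revit_elements (
                   element_id INTEGER PRIMARY KEY,
                   category   TEXT,
                   name       TEXT,
                   level      TEXT
               )"""
        )
        con.executemany(
            "INSERT OR REPLACE INTO revit_elements VALUES (?, ?, ?, ?)",
            rows,
        )
        con.commit()
    finally:
        con.close()

# Example with dummy data:
# export_rows("model_health.db", [(1001, "Walls", "Basic Wall", "Level 1")])
```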

The most important advantage of Dynamo is that it allows us to filter information and export it to various serialization formats, including XML, JSON, HTML and CSV.
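For the JSON and CSV cases, a small sketch like the one below is enough, using only the Python standard library; the field names are again illustrative assumptions.

```python
import csv
import json

def to_json(path, records):
    # records: list of dicts with identical keys
    with open(path, "w", encoding="utf-8") as f:
        json.dump(records, f, indent=2)

def to_csv(path, records):
    if not records:
        return
    with open(path, "w", newline="", encoding="utf-8") as f:
        writer = csv.DictWriter(f, fieldnames=list(records[0].keys()))
        writer.writeheader()
        writer.writerows(records)

# records = [{"element_id": 1001, "category": "Walls", "level": "Level 1"}]
# to_json("elements.json", records)
# to_csv("elements.csv", records)
```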

Collect data in a structured format

We start by selecting the data we will use. For a model health check, for example, we need the parameters of the model's elements; we collect this parameter data and then run the necessary checks on it.
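A sketch of this collection step inside a Dynamo Python node could look like the following. It gathers a couple of parameters per wall element; the category and parameter names are examples, not the exact checks described here.

```python
import clr
clr.AddReference("RevitAPI")
clr.AddReference("RevitServices")
from Autodesk.Revit.DB import FilteredElementCollector, BuiltInCategory
from RevitServices.Persistence import DocumentManager

doc = DocumentManager.Instance.CurrentDBDocument

# Collect all wall instances and read a couple of parameters per element.
rows = []
collector = (FilteredElementCollector(doc)
             .OfCategory(BuiltInCategory.OST_Walls)
             .WhereElementIsNotElementType())
for wall in collector:
    comments = wall.LookupParameter("Comments")
    rows.append({
        "element_id": wall.Id.IntegerValue,
        "name": wall.Name,
        "comments": comments.AsString() if comments else None,
    })

OUT = rows  # Dynamo's output port
```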

In the next step we remove the bad data. For example, we detect whether there are outliers in the dataset that should be removed, or missing values that need to be handled. In this way we eliminate entries that would give null or incorrect results. As an additional step, we normalize the data, because it arrives in mixed forms: text, numeric, binary and so on.
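The cleaning step can be sketched as a small function like the one below: it drops records with missing required fields and coerces the remaining values into consistent types. The specific rules are illustrative assumptions, not the author's exact checks.

```python
def clean(records, required=("element_id", "name")):
    """Drop records with missing required fields and normalize value types."""
    cleaned = []
    for rec in records:
        # Remove entries that would give null or empty results.
        if any(rec.get(key) in (None, "") for key in required):
            continue
        normalized = {}
        for key, value in rec.items():
            if isinstance(value, bool):
                normalized[key] = int(value)           # binary -> 0 / 1
            elif isinstance(value, (int, float)):
                normalized[key] = value                # keep numbers as numbers
            elif value is None:
                normalized[key] = ""                   # missing -> empty text
            else:
                normalized[key] = str(value).strip()   # everything else -> text
        cleaned.append(normalized)
    return cleaned
```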

I built this entire data collection and data analysis stage with Dynamo and Python. All model checking operations are defined in code, and the result is structured data. This data is then transferred to a SQL database so it can be queried and reused.
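Once the structured data sits in the database, model-health reporting becomes plain SQL. The query below is a sketch against the hypothetical schema from the earlier SQLite example, not the author's actual schema.

```python
import sqlite3

def count_missing_levels(db_path):
    """Count exported elements whose level information is empty."""
    con = sqlite3.connect(db_path)
    try:
        cur = con.execute(
            "SELECT COUNT(*) FROM revit_elements "
            "WHERE level IS NULL OR level = ''"
        )
        return cur.fetchone()[0]
    finally:
        con.close()

# print(count_missing_levels("model_health.db"))
```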