Enterprise Analytics for Commercial Real Estate

Our client is a global firm and one of the top five commercial real estate brokers in the world. This project focused on data analytics: a Qlik implementation and customization engagement for the client's Institutional Customers.

Impact

Our client had an idea for providing a more intimate level of support to their Institutional Customers. An example of an Institutional Customer would be a Fortune 100 company like IBM. These companies typically have hundreds of offices all over the globe, and judging by the number of their offices, properties, and leases, each operates like a mini commercial real estate broker: a brokering and leasing company in one. The idea was that technology could provide a deeper level of slicing-and-dicing capability across each of these Fortune 100 portfolios, covering real estate investments, occupancy rates, and lease optimization.

Solution

For them to be successful on this journey, they had two primary goals. The first was to bring all the pertinent data into one place. The second was to build a data integration platform and an enterprise data warehouse. For the technology stack, the ETL (data aggregation) process used Microsoft SQL Server Integration Services (SSIS). The data warehouse itself also ran on SQL Server, keeping the platform within the Microsoft stack, while Qlik served as the data visualization layer. To complete this project, the team was divided into two sub-teams: one focused on data integration and the data warehouse, and one on the Qlik implementation. We had two data architects and one Qlik architect here in Dallas, along with four data developers and three Qlik developers in our New Delhi office.
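To make the nightly aggregation concrete, here is a minimal sketch of what one incremental load step might look like. The actual project implemented this logic as SSIS packages; the Python/pyodbc version below, along with its connection strings, table names, and columns, is purely illustrative.

```python
# Illustrative sketch of one nightly ETL step: incrementally copy new lease
# records from a source system into a warehouse staging table.
# Connection strings, table names, and columns are hypothetical; the actual
# project implemented this logic as SSIS packages.
import pyodbc

SOURCE_CONN = "DRIVER={ODBC Driver 17 for SQL Server};SERVER=src-db;DATABASE=Leases;Trusted_Connection=yes;"
DW_CONN = "DRIVER={ODBC Driver 17 for SQL Server};SERVER=dw-db;DATABASE=EDW;Trusted_Connection=yes;"

def load_new_leases() -> int:
    """Copy source rows newer than the warehouse high-water mark."""
    with pyodbc.connect(DW_CONN) as dw, pyodbc.connect(SOURCE_CONN) as src:
        dw_cur = dw.cursor()
        # High-water mark: the most recent source timestamp already loaded.
        dw_cur.execute("SELECT MAX(modified_at) FROM stg.Lease")
        high_water = dw_cur.fetchone()[0]  # None on the very first load

        src_cur = src.cursor()
        if high_water is None:
            src_cur.execute(
                "SELECT lease_id, property_id, rent, modified_at FROM dbo.Lease")
        else:
            src_cur.execute(
                "SELECT lease_id, property_id, rent, modified_at "
                "FROM dbo.Lease WHERE modified_at > ?", high_water)

        rows = src_cur.fetchall()
        dw_cur.fast_executemany = True  # batch the inserts for speed
        dw_cur.executemany(
            "INSERT INTO stg.Lease (lease_id, property_id, rent, modified_at) "
            "VALUES (?, ?, ?, ?)", [tuple(r) for r in rows])
        dw.commit()
        return len(rows)

if __name__ == "__main__":
    print(f"Loaded {load_new_leases()} new lease rows")
```

The high-water-mark pattern shown here is what keeps a nightly window feasible: each run moves only the rows that changed since the previous run, rather than reloading entire source tables.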

Result

The entire project took a little over 12 months, and in the end we were able to onboard almost 15 institutional customers onto the platform. Between the client's teams and the customers' teams, the platform supported a little over 400 users. This was a large data warehouse, with a database measured in terabytes. Add to that the complexity of 15 institutional customers, and we're talking about close to one thousand different data integration jobs that needed to run and be validated every night.
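Validating close to a thousand nightly jobs is typically done with an automated status sweep rather than by hand. The sketch below assumes a hypothetical audit table (etl.JobRunLog) that each integration job writes a row to; the table, its columns, and the 20% row-count tolerance are illustrative assumptions, not details taken from the project.

```python
# Illustrative nightly sweep over the integration jobs: flag any job that
# failed, or whose loaded row count swung suspiciously versus expectations.
# The audit table (etl.JobRunLog) and its columns are hypothetical.
import pyodbc

DW_CONN = "DRIVER={ODBC Driver 17 for SQL Server};SERVER=dw-db;DATABASE=EDW;Trusted_Connection=yes;"

def validate_last_night() -> list[str]:
    """Return a human-readable problem list for last night's job runs."""
    problems = []
    with pyodbc.connect(DW_CONN) as dw:
        cur = dw.cursor()
        cur.execute("""
            SELECT job_name, status, rows_loaded, expected_rows
            FROM etl.JobRunLog
            WHERE run_date = CAST(DATEADD(day, -1, GETDATE()) AS date)
        """)
        for job_name, status, rows_loaded, expected_rows in cur.fetchall():
            if status != 'SUCCESS':
                problems.append(f"{job_name}: run failed ({status})")
            elif expected_rows and abs(rows_loaded - expected_rows) > 0.2 * expected_rows:
                problems.append(f"{job_name}: row count off by more than 20%")
    return problems

if __name__ == "__main__":
    for issue in validate_last_night():
        print(issue)
```

A sweep like this turns a thousand-job validation problem into a short exception report: operators only look at the jobs that failed or drifted, not at every run.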