Note: This article describes data models in Excel 2013. However, the same data modeling and Power Pivot features introduced in Excel 2013 also apply to Excel 2016. There’s effectively little difference between these versions of Excel.

Although you can easily build huge data models in Excel, there are several reasons not to. First, large models that contain multitudes of tables and columns are overkill for most analyses, and make for a cumbersome Field List. Second, large models use up valuable memory, negatively affecting other applications and reports that share the same system resources. Finally, in Microsoft 365, both SharePoint Online and Excel Web App limit the size of an Excel file to 10 MB. For workbook data models that contain millions of rows, you’ll run into the 10 MB limit pretty quickly.

In this article, you’ll learn how to build a tightly constructed model that’s easier to work with and uses less memory. Taking the time to learn best practices in efficient model design will pay off down the road for any model you create and use, whether you’re viewing it in Excel 2013, Microsoft 365 SharePoint Online, on an Office Web Apps Server, or in SharePoint 2013.

Consider also running the Workbook Size Optimizer. It analyzes your Excel workbook and, if possible, compresses it further.

In this article:

- Compression ratios and the in-memory analytics engine
- Nothing beats a non-existent column for low memory usage
- Two examples of columns that should always be excluded
- What about filtering just the necessary rows?
- What if we need the column; can we still reduce its space cost?
- Using DAX calculated measures instead of columns

Compression ratios and the in-memory analytics engine

Data models in Excel use the in-memory analytics engine to store data in memory. The engine implements powerful compression techniques to reduce storage requirements, shrinking a result set until it is a fraction of its original size. On average, you can expect a data model to be 7 to 10 times smaller than the same data at its point of origin. For example, if you’re importing 7 MB of data from a SQL Server database, the data model in Excel could easily be 1 MB or less. The degree of compression actually achieved depends primarily on the number of unique values in each column: the more unique values, the more memory is required to store them.

Why are we talking about compression and unique values? Because building an efficient model that minimizes memory usage is all about compression maximization, and the easiest way to do that is to get rid of any columns that you don’t really need, especially if those columns include a large number of unique values.

Note: The differences in storage requirements for individual columns can be huge. In some cases, it is better to have multiple columns with a low number of unique values rather than one column with a high number of unique values. The section on Datetime optimizations covers this technique in detail.

Nothing beats a non-existent column for low memory usage

The most memory-efficient column is the one that you never imported in the first place. If you want to build an efficient model, look at each column and ask yourself whether it contributes to the analysis you want to perform. If it doesn’t, or you aren’t sure, leave it out. You can always add new columns later if you need them.

Two examples of columns that should always be excluded

The first example relates to data that originates from a data warehouse. In a data warehouse, it’s common to find artifacts of the ETL processes that load and refresh data in the warehouse. Columns like “create date”, “update date”, and “ETL run” are created when the data is loaded. None of these columns are needed in the model, and they should be deselected when you import the data.

The second example involves omitting the primary key column when importing a fact table. Many tables, including fact tables, have primary keys. For most tables, such as those that contain customer, employee, or sales data, you’ll want the table’s primary key so that you can use it to create relationships in the model. Fact tables are different: in a fact table, the primary key is used to uniquely identify each row. While necessary for normalization purposes, it’s less useful in a data model, where you want only those columns used for analysis or to establish table relationships.