I’ve been setting up some PerformancePoint Planning demonstrations for both clients and internal knowledge transfer. As part of these demonstrations I’ve been loading the Account Dimension from CSV files.
There are several other ways of loading data into PerformancePoint planning dimensions and models and I’ll no doubt post about the alternatives in the future.
There is a small gotcha that I thought I’d share. The pre-defined Account Dimension contains a member property called Account Type. This member property utilises a lookup table for the various built-in account types such as Unit, Expense, etc.
The PerformancePoint CSV format requires that the first row contains the field (or rather member property) names, and optionally data types, with the remaining rows containing the actual data. The Account Type member property is slightly different: as this is a lookup field, you need to specify the key field name instead, in this case AccountTypeMemberId.
With that known, you would be forgiven for thinking that, in order to load data against that field, you need to specify the actual AccountTypeMemberId. However, that would result in a new member property being created called ‘AccountTypeMemberId’ that contains the raw value rather than the description. The intended destination field, Account Type, would be left unpopulated.
Instead, to correctly load the member property, rather than use the Id, you need to specify the actual description from the lookup table. (Not the only unintuitive feature of PerformancePoint Planning!)
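To illustrate, here’s a minimal sketch of what such a CSV might look like. The member names and descriptions are purely hypothetical; the point is that the header row uses the key field name AccountTypeMemberId, while the data rows carry the lookup table descriptions (such as Unit or Expense), not the numeric Ids:

```
Label, Name, Description, AccountTypeMemberId
SAL001, Salaries, Staff salary costs, Expense
HC001, Headcount, Number of employees, Unit
```

Had the rows contained the numeric Ids instead, the loader would have created a new ‘AccountTypeMemberId’ member property and left Account Type empty, as described above.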