
Friday, August 29, 2008

Sacha Tomey's blog – PPSCMD GUI

Sacha was kind enough to link to some of my posts from the Train the Trainer sessions in Denver.

He has published a simple GUI for the PPSCMD command.

I'm also looking at automating jobs from here, as well as the undocumented commands.

PerformancePoint Server 2007 PPSCMD GUI

I've built a really simple GUI for a couple of commands of the PPSCMD utility.  I always take far too long to work out the syntax and navigate to the appropriate directory (yes, I ought to update the PATH environment variable), so I felt I could justify building a simple front end to help speed up the usage.

So far I've only implemented the MIGRATE and REPROCESS commands - I use these quite a lot outside of any automated SSIS package, so they seemed the most sensible to implement in the GUI first.  I do intend to extend it to encompass some of the other commands, and I would welcome any feedback on prioritisation, usage, features and the inevitable bugs.  It's currently version 0.1 and more or less ready for 'Community Preview' - there are some omissions, such as full error handling and validation, that I intend to implement over time along with the other commands.

It's a .NET 3.5 application so you will need to deploy it to a client where you are happy to install .NET 3.5 if it's not already present.

You can download version 0.1 from here - (You will need to create a login first)

Sacha Tomey's blog

Tuesday, August 26, 2008

Andrew Fryer's Blog – Memory settings for SQL & Analysis Services

His memory settings seem very conservative, but they're worth trying within the PerformancePoint VPC.

You can decide on different values based on your workload. Keep in mind that less memory will mean fewer pages in memory (more I/O) on SQL and potentially less space for Analysis Services to cache queries. Processing will also take a hit depending on cube size, since processing copies the cube into memory.

  • SQL Server Database - Set SQL to use a max of 40% of physical memory.
  • Windows - Set <LimitSystemFileCacheSizeMB> to 20% of total: 65022 MB * .20 = 13004 MB.  This limits how much memory the Windows file system cache uses.
  • Analysis Services - Set SSAS to use a max of 35% of memory: edit the msmdsrv.ini file (make a backup first) and set <TotalMemoryLimit> to 35.

Set <PreAllocate> to 35 (a percentage of total physical memory) to preallocate memory at server startup. I found this especially helps on NUMA-enabled servers. More on SSAS properties can be found here, and more on tuning Analysis Services can be found in this white paper.
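The SQL Server side of this can be scripted rather than clicked through. A minimal sketch, assuming the same 65022 MB box from the example above (40% is roughly 26009 MB):

-- Cap SQL Server at ~40% of physical memory (26009 MB on a 65022 MB server)
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'max server memory (MB)', 26009;
RECONFIGURE;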

If there are problems with runaway queries using too much memory, you can run ASGovernor to cancel queries when memory usage is greater than 40% (5% past the target for SSAS).  It might be that this threshold is too large and a smaller one is needed.

Andrew Fryer's Blog

Tips from Train the Trainer Sessions – Day 4

Rules

  • Sometimes you need to remove carriage returns, re-enter them, and put in extra spaces to fix validation issues.
  • For SQL rules & the time dimension, ensure you use Add Lowest instead of Add Descendants.
  • Create PEL business rules instead of Excel formulas wherever possible, for reusability and centralization of calculations.
  • Design Time Matrix - Set filter with All checked in order to use formulas with multiple filters.
  • There are no CAST or CONVERT statements in PEL.  If you want to perform math on member properties, use NativeMDX or NativeSQL or Assumption Models.
  • Create a Scenario called Budget Variance, which can use Budget - Actual.  Create definition rules.  Create 5 variances: % of revenue, ForEx variance, % of sales, etc.
  • Definition rules always run in solve order, and the last one orders the calculations.  Make sure the TimeDataView Calculation definition rule is at the end in order to perform YTD calcs against calculated members.
  • Keep rules small.  Use only a single This & Scope for simplicity.  Nest RuleSets... but don't ever go more than 1-2 levels deep, or performance will degrade.
  • Rule Sets - with 17 rules in 3 folders, put them in 1 RuleSet and create 1 job that runs the 17 rules.
  • Comparison-to-last-year rule - a useful expression: this = ($CurrentPeriod$.Lag(0)) - ($CurrentPeriod$.Lag(12));

Data Input

  • Use a numeric signoff code column to ensure that users have reviewed the budget and are comfortable signing off.
  • Include workbook checkbox - sometimes #REF! is stored in the boxes when corrupted.  Ask users to save a copy of the workbook.  There are performance issues with 'include workbook'.  Avoid it.
  • PerformancePoint includes annotations, which are different from Excel comments.  Annotations get saved in the cube.  Use the View - Annotations menu option in PerformancePoint.  Orange cells have annotations; right-click on each orange cell to check for them.  Users can only see their own annotations.
  • You can use data validations inside a matrix.  For signoff, use a list: 1 - I agree as is, 2 - made changes.  Excel will block any other entry.
  • If 15 assignments are submitted to 1 contributor, separate assignments are available for rejection.  The aggregated view is also available.
  • Use stored procedures to create a member set out of member properties as a parent/child hierarchy, instead of relying on the member property.  There are limitations to member views vs. member sets.
  • Use member properties within report Filters to perform cascading filters.  Eg. create a department member property in the employee dimension, assign employees to departments, and assign the filter to the property.
  • Ensure Reports directory is read-only for everyone but modelers.
  • Create a separate section with "Suggested changes" with read/write access or use Sharepoint Library.  Create Groups for Detail & Summary reports.
  • Remove blank columns and blank rows in Report Properties - Matrix properties.
  • Intersections - Create asymmetric reports using row/column intersections with multiple different rows/columns.
  • When copying Matrices during authoring, you will need to re-add the properties within the matrix as they get lost during a copy.

Planning - TechNet Forums

Tips from Train the Trainer Sessions – Day 3 (cont’d)

Security Nuggets

Security is “different” in PerformancePoint Planning than in SQL or NTFS. 

  • Models have a global On/Off setting for access by role.  Turn this on first.
  • Roles are not cumulative.  If one role has NO ACCESS, then the user has NO ACCESS.
  • There is an operational report that describes security information for a model.
  • Use Mirror Migration to take SQL Backups and Restorations to move from Staging to Production.

Mirror Migration

  1. Synch to Staging (backup data) for models and dimensions.
  2. Backup the Staging Database to another database location.
  3. Perform Mirror Migration.
  4. Restore Staging database data.
  5. Synch Staging to Production.
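Steps 2 and 4 are ordinary SQL Server backup and restore operations. A minimal sketch, with hypothetical database and file names:

-- Step 2: back up the Staging database (names and paths are hypothetical)
BACKUP DATABASE [MyApp_StagingDB]
TO DISK = N'D:\Backups\MyApp_StagingDB.bak'
WITH INIT;

-- Step 4: restore the Staging data after the mirror migration
RESTORE DATABASE [MyApp_StagingDB]
FROM DISK = N'D:\Backups\MyApp_StagingDB.bak'
WITH REPLACE;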
  • Customizing Member Permissions means that you will need to add users to new dimension members when members are added.  This is because it converts from 'All members' to a specific set of members.
  • If you define custom security for lots of different dimensions, performance is impacted.  Ensure you use only the minimum set of dimension security.  Locking the scenario is not necessary, as users can only input to the scenarios specified in the assignment.  As soon as you switch from 'All Members' to custom, performance will be impacted.
  • Use SSL for the PPS service, and IPSEC.  Use a hardware SSL accelerator.  Ensure your data is protected.
  • For accounts with just read access - create a 2-layer design: input at the leaf, and in the report only show the parent.  Add dummy records so that leaves are not shown in reports: duplicate the leaf members but do not show them, and populate data at the leaf.  This way you can enforce security by not allowing access to leaf-level data in reports, without taking the security performance hit.
  • Coming in V2? - Give assignments to certain roles instead of users, propagating properly through the emails & user name tree.  Generate assignments by group rather than by user.  Multiple roles in 1 assignment.
  • Assignments are on a per-user level.   One assignment per user.  That means 300 assignments will need to be created for 300 users.
  • Bug - the Save button doesn't grey out after Save, and the check-in pencil doesn't disappear when saving roles.  Roles are not 'checked in', so the pencil is invalid anyway.  Save immediately writes role information to the cube.
  • When the status shows Calling Dispatch Manager - refresh the model.
  • When creating roles, save after each role and refresh modeler.
  • When creating cycles, save after each cycle and refresh modeler.
  • When creating assignments, save after each assignment and refresh modeler.
  • Regular Refresh of model = happy modeler.
  • Process Management and Security sections do not have check out functionality.  Make sure only 1 person is working in these sections at a time.
  • Assignment Schedules - Create recurring assignments, use the powerful functionality of days after cycle start date and assignment start date to schedule different groups of assignments.  
  • Cycles = Same Scenario, Time period, Model.
  • Assignments = Can have multiple assignments in same cycle.  Not usually a good idea.
  • Reviewer/Approver can enter/modify data.  Assignments can be rescheduled.  Assignments can be set to a 3-hour window, and then rejected.
  • Best Practice - Make the cycle a bit longer than the assignment - Grace period.
  • Assignment Importance - it changes the importance on email notifications; however, there are problems with the built-in email notifications.  It is recommended to turn off notifications and send manual emails for assignments.
  • Security changes take effect immediately.  No assignment changes or deployment changes are necessary.
  • Create domain users to test with roles.
  • Use multiple modeler accounts.   Remove them from modeler role.  Add them to test role to test input permissions.
  • Create a fake placeholder member in each dimension, eg. "Please select a geography…" - a leaf-level member of the dimension.  Give all users read access to that member and set it as the default selection for filters.  Create the Please Select... dummy member in each dimension.  This will improve performance of the matrix in the Excel add-in, as it will not refresh data until all dimensions are selected.
  • All users have to have read access to all Account members.  You have to design the report without the salary line, or use separate models when dealing with sensitive information like salary.  You can also create an alternate member set without Salary, and then create a member set with Salary.
  • Create a form with a Member View, and filter out Salary in the Member View.  The leaf-level property has to have the same name as the account.  For ones that you don't want written to, give the leaf-level property a different name than the account.
  • Use fake data for IFRS/SOX requirements when dealing with salary information.  The modeler should not have access to the data.
  • Create a Restricted property on the Account dimension and assign the restricted accounts.  In Report Properties for Filters, set up the filter to only show unrestricted accounts.  Set True Hide & True Lock on the filter.

Debugging Rules

  • Try Allow Drillthrough - a great tool for debugging rules.  Right-click and see the fact records in a table.  Great for troubleshooting rule results.
  • Use the Debug command on a rule to see what SQL or MDX will be generated.

Forms Design

  • Turn off merge/center.  Turn on auto-indent row members; auto-indent will only affect the left column.  Totals for a group can be at the top or bottom.

 

Planning - TechNet Forums

Tips from Train the Trainer Sessions – Day 3

More nuggets:

  • You can use Associations in your model to clear data from a model using Full Replacement option.
  • Almost always use Scoped Replacement when running associations against a model.
  • Setting validation to False will ignore unknown member data.
  • The record count shown for ‘deleting records’ with associations isn’t the actual record count, just the possible count.  So when you see 108 records deleted, there could be 108 possible combinations.
  • When dealing with loading hierarchies in SSIS, look at the H_ tables.
  • Always perform data manipulation in the staging database.  Never adjust the production database.
  • The Associations icon for Actions has an exclamation point.  This doesn't mean anything.  Don't worry about it.

 

  • Any user in a business role can be made the owner of a job and then they can run the job without being a modeler.   This does not include data movement jobs, where you need to be a data administrator.
  • Every night or on a scheduled basis, you can automatically run jobs to aggregate data up to summary info - balance sheet, income statement, currency translations.  All data in any currency. 

TechNet Forums

Microsoft Business Intelligence Blog - Proclarity & SQL 2008

Let's hope they get a similar update for PerformancePoint.

If you want to get ProClarity working with SQL Server 2008, you need the following hotfixes

http://www.microsoft.com/downloads/info.aspx?na=22&p=6&SrcDisplayLang=en&SrcCategoryId=&SrcFamilyId=&u=%2fdownloads%2fdetails.aspx%3fFamilyID%3ddc9cbe69-1db1-4f5d-9b48-06b156c0f2c6%26DisplayLang%3den

http://www.microsoft.com/downloads/info.aspx?na=22&p=9&SrcDisplayLang=en&SrcCategoryId=&SrcFamilyId=&u=%2fdownloads%2fdetails.aspx%3fFamilyID%3dced0bd47-7d19-40de-8e80-f924784bd7af%26DisplayLang%3den

Microsoft Business Intelligence Blog

Tips from Train The Trainer Sessions – Day 2


More Nuggets:

  • When you need faster access to data within the model, sometimes it makes sense to build relational reports directly off the Application Database using tools such as PerformancePoint M&A or Reporting Services.
  • When loading staging data, you may have to run the validation stored procedure twice if you are importing annotations.  The parameters are mutually exclusive, so you will validate either annotations or members.
  • @AnnotationOnly parameter - F: validate measures only.  T: validate annotations only.

  • When validating data loads, you need to check the bizsystemflag, bizvalidationstatus, and bizsystemerror columns.  Any one of these columns could hold error information while the others are null.
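A quick way to eyeball those columns after a load is a grouped count. A rough sketch; the measure group table name here is hypothetical, and the exact column names should be verified against your own staging database:

SELECT BizSystemFlag, BizValidationStatus, BizSystemError, COUNT(*) AS RecordCount
FROM MG_Sales_MeasureGroup_default_partition_label  -- hypothetical: substitute your table
GROUP BY BizSystemFlag, BizValidationStatus, BizSystemError
ORDER BY RecordCount DESC;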

These styling tips are probably the most powerful nuggets I heard from one of the experts in the field:

  • Cell Styles – you can modify the pivot styles to change the format of currency values and the look and feel of PerformancePoint reports and data input forms.
  • In Excel 2007, click the Office icon and select Excel Options.  Under the Popular section, choose Show Developer tab in the Ribbon.  Record a macro.

    Select the Home tab.  Choose Format as Table.  Select a new PivotTable style.  Create the style.  Each component of the style affects a different component in a PerformancePoint report/data input form.

    Review the macro.  Note these lines.

    With ActiveWorkbook.TableStyles("PivotTable Style 1")
        .ShowAsAvailablePivotTableStyle = True
        .ShowAsAvailableTableStyle = False
    End With

    You'll want to modify the macro so that ShowAsAvailableTableStyle is set to True, then rerun it, as shown below.
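    The modified block would then look like this (same style name as the recorded macro above):

    With ActiveWorkbook.TableStyles("PivotTable Style 1")
        .ShowAsAvailablePivotTableStyle = True
        .ShowAsAvailableTableStyle = True   ' changed from False
    End With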

    This should allow you to create Pivot Style templates that should show up as Matrix Styles in PerformancePoint.

     

  • Conditional Formatting
  • Name your Account members with a % in the name, then use conditional formatting to format those values as percentages.  This could also work with a 'hidden' member property: create a member property called 'hidden' with a 1 or 0 and use conditional formatting to hide the rows.

    For any accounts or members with 'total' in the name, use a conditional format to increase the size and bold the contents.

     

  • Importing/Exporting Forms can only be performed from the same Application – unless you manually edit the RDL of the form.
  • Use naming convention standards for objects.  Cycles should be named based on Role/Scenario/Time Range.  Eg FY09-Budget-Sales-Input
  • When installing PerformancePoint, run the installation as the service account you will be using to run the service.
  • Always set cycle dates to at least 1 day after the current date, then manually instantiate the cycle.  This avoids waiting for the cycle to appear automatically.
  • Always use roles instead of users when assigning approvers/contributors.
  • Always set assignments to Move to Submitted or Not Submitted state instead of Do Nothing.  The preference is Not Submitted, to see which planners have not completed their assignments.  Either option will close the cycle, preventing changes to the plan, whereas the Do Nothing option will allow modification of the assignment after the cycle end date.
  • Never regenerate assignments during a cycle if you can help it.  These assignments are tracked by user id which may change after a regeneration, making it hard to track history of the assignment.
  • Use 1 assignment for each form rather than multiple forms in an assignment.
  • Performance in the Excel add-in becomes an issue sometimes after adding > 4 matrices.
  • You can only fit 6 assignments on the assignment list in the Excel add-in; you need to click 'more assignments' and search to check for additional assignments.  It may make sense to display assignments in a Sharepoint page or Reporting Services report.
  • Always use Submit Draft for first revisions.  Submit Final closes the assignment for review or approval, and it would have to be rejected if any changes are required.  The Submit Final option can be performed once the budget cycle is closed.
  • Add parameters to rules to zero-out data before an allocation rule.  Otherwise allocation rules will add to existing data, potentially duplicating results.  (Allocation – Add)
  • Set the available time members within a rule using the special Time drop down list.  Otherwise use the current period reference and a range of members before and after current period. (lag function)
  • Create referential integrity within dimension properties by assigning references to other dimensions.  Adding new properties will check data within the other dimension.

 

  • Within PEL (PerformancePoint Expression Language) both MDX and SQL comment syntax is valid.
  • Eg. /* */, --, //

     

  • Native SQL rules can be used to run SQL Agent jobs.  This provides a tool that planners and modelers could use from within Excel to run ETL packages, process the cubes, and perform many other tasks without requiring direct access to SQL Server.  Even if you are not using PerformancePoint functionality, this may be useful for integrating Excel with SQL Server for other purposes (see the sketch after this list).
  • Create a Version dimension to track what stage in the Cycle the data is at.  Save version by User-Scenario-Model. 
  • Use Time.CurrentPeriod.Parent.Parent to get to higher levels in the time hierarchy for assumption models.
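As a sketch of the SQL Agent idea above: the body of a NativeSQL rule could simply call the standard msdb job-start procedure. The job name here is hypothetical; the job itself might run an SSIS package or process a cube:

-- Kick off a SQL Agent job from a NativeSQL rule ('Nightly ETL Load' is a hypothetical job name)
EXEC msdb.dbo.sp_start_job @job_name = N'Nightly ETL Load';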

Planning - TechNet Forums

Loading Data for Sample Apps

Nugget: Many of the sample labs for PerformancePoint suggest loading data from CSV files.

If the data already exists in another application database, why not try and create a Data Source connection in PerformancePoint administrator against the Staging or Application database dimension tables instead?

TechNet Forums

Member Set vs. Member View

From Books Online:

Member Set (no mapping)

An additional way for you to manage dimensions by organizing dimension members.  A flat member set has its members organized as siblings, or a member set can be organized into a hierarchy of parent/child members. You must deploy your model site after defining or changing member sets to update the dimension.

When you deploy a member set, you are actually deploying a copy of the dimension with a different structure.  These sets can be used to set granularity of the fact data within a model/cube.  You can have only one member set for each dimension assigned to a model, though you can create multiple member sets in each dimension. 

For things like Exchange Rates, it may make sense to have separate global assumption models for Annual and Monthly exchange rates for granularity.

You may also want to sort the data in the dimension a different way, which is where a member set comes in handy.

Member View (no mapping)
Hierarchies of dimension members that are grouped according to member properties; they provide an alternative way to create reports on the members of a dimension, using dimension member properties to analyze groups of related members.

Member View is an essential feature of the Planning Business Modeler. It allows you to view a hierarchical order of the member properties and to get detail data related to it in a comprehensive manner. Member View is an option in Dimensions that helps to create reports, but not build models. Member View also enables you to calculate the sum of all the dimension members within the property. For instance, a car manufacturing company might have a product dimension with members like sporting cars, family cars and so on. Here a property can be All Sporting Cars in Red. You can make use of the Member View to view the hierarchies of the dimension members that are grouped according to their properties.

With the application of Member View you can find the total sales of sporting cars at a particular level of the dimension. Dimension members are the primary data for creating models, but Member View is not used for building models. To create member views, go to the Site Browser pane and select Dimension. After selecting a particular dimension, find Dimension Tasks to create a Member View. Name the Member View in the dialog box provided and label it so you can identify it later. Fill the Description box with a short description of the Member View. Add a property to the Member View, and from here you can select the hierarchies for the properties. You can also select the Hide Members from View option if required. Finally, click OK to save the Member View.

A member view is a subset or summarized version of the dimension members used in reporting.  Member views are grouped by member properties.  These are used to calculate sums and define hierarchies against member properties.  Eg. Region - Country – City.

TechNet Forums

Monday, August 25, 2008

Microsoft BI Newsletter – August 2008

 

Welcome to the 5th issue of the Microsoft Business Intelligence Newsletter! The 'Spotlight' article for this August issue features BusinessWeek Research Services Report: "The ROI of Pervasive Performance Management." This downloadable whitepaper includes in-depth telephone interviews with executives at large and midsize companies known to be employing Performance Management and using Business Intelligence tools. And, of course, we have included great customer success stories, partner-ready tools and product news. We hope that you find the newsletter to be a valuable resource and we encourage you to share it with your colleagues!

http://www.microsoft.com/bi/newsletter/BI_Newsletter_August08.htm

Financial Systems and Business Intelligence

Paul has mixed results with Management Reporter, and some response from the product team that performance issues will be addressed in V2.

After months of testing PerformancePoint Server Management Reporter in its current state, I have concluded that it will not handle the volumes that our company requires.

Financial Systems and Business Intelligence

One workaround could be to allow the users to design their own reports, and then have a promotion process to translate the reports for performance and scalability using summarized queries and tools such as Reporting Services and PerformancePoint M&A.

Of course, this gets rid of the idea of ‘self-service reporting’.

Perhaps quantifying the volumes that Paul’s company has could tell us more about MR’s limits?

pass@word1 and localization

Nugget: pass@word1 is really tough to type with certain international keyboard layouts and Virtual PC.  Use pass$word1 instead, or better yet a strong password that isn't readily obvious.

3. The New User dialog box appears. Enter the following information for the new account:
User name: PPSAdmin
Password: pass@word1

Establish. Execute. Evolve. ®: On the Rational Guide to Planning with PerformancePoint Server 2007

http://blogs.msdn.com/robcaron/archive/2007/01/02/1400971.aspx

How We Did It: blueKiwi SharePoint Connector and OfficeAssistant | SharePointed

Social Networking plugin integrated with PerformancePoint and Sharepoint.

Investing in the SharePoint Platform

As we look towards the future, we plan on investing further in integrating with SharePoint's records management capabilities so that any or all blueKiwi conversations can be retained as corporate records. We will also include several Community dashboards that leverage SharePoint's KPI Web Parts and Report Center functionality, as well as the integration of our social metrics into balanced scorecards built using Microsoft PerformancePoint Server.

How We Did It: blueKiwi SharePoint Connector and OfficeAssistant | SharePointed

MSDN Blog Postings » Blog Archive » PerformancePoint Server Security and Kerberos

 

Here is a list of and links to our existing security documentation — organized according to our PPS book categories (Architecture and Planning, Deployment, and Operations):

Architecture and Planning

Deployment

Operations

MSDN Blog Postings » Blog Archive » PerformancePoint Server Security and Kerberos


The Microsoft ProClarity Team Blog

 

Thanks to Dan Wilmot for this whitepaper.

Many users need the ability to run a query against an SSAS cube that includes a date range. Typically this date range consists of a “start date” and an “end date” selection. The default behavior of dimensions in the cube makes this type of selection difficult. This paper discusses how to design your cube to include a new “Start Date” and “End Date” dimension, then how to add those dimensions to the cube, and finally how to build a named set that represents your date range.

The Microsoft ProClarity Team Blog

Thursday, August 21, 2008

Train the Trainer - Helpful tricks

I am taking a course in Denver on PerformancePoint Expert Training. The contributions by the attendees to improving knowledge around planning have been even more valuable than the training itself.

I will try and post some of the nuggets here to assist with deployments. (Since I'm in Denver, nuggets sound appropriate to use)

Some are more process-oriented than PerformancePoint specific.

Nugget #1 - Make the cycle a bit longer than the assignment - This provides a grace period.

Nugget #2 - In Modeler, Save and Refresh are your friends. Perform these often.

Nugget #3 - Dimension-level (member level) custom security hits performance in Excel filters and queries, and when directly accessing the cubes. Create a "Please select..." member in each of the dimensions. Use this as the default selection in your reports/input forms that require security. This will make the users select all filters that are security-dependent, and should speed up initial queries.

Finding IDs in a database

PerformancePoint uses IDs in the application database which makes for interesting challenges when trying to trace what is going on using SQL Profiler statements.

Here is a script to search databases for a text term; it can easily be modified to search int and bigint data types and to use = instead of a LIKE clause in the search criteria.

http://vyaskn.tripod.com/search_all_columns_in_all_tables.htm
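As a rough sketch of what such a script does (the one at the link is more complete and handles edge cases), assuming you are hunting for a string value:

DECLARE @search nvarchar(100);
DECLARE @sql nvarchar(max);
SET @search = N'Fabrikam';  -- term to look for
SET @sql = N'';

-- Build one SELECT per character column; for int/bigint IDs, filter on those
-- data types instead and use = rather than LIKE.
SELECT @sql = @sql
    + N'SELECT ''' + TABLE_NAME + '.' + COLUMN_NAME + ''' AS [Column], COUNT(*) AS Hits '
    + N'FROM ' + QUOTENAME(TABLE_SCHEMA) + '.' + QUOTENAME(TABLE_NAME)
    + N' WHERE ' + QUOTENAME(COLUMN_NAME) + N' LIKE ''%' + @search + N'%'' HAVING COUNT(*) > 0; '
FROM INFORMATION_SCHEMA.COLUMNS
WHERE DATA_TYPE IN ('char', 'nchar', 'varchar', 'nvarchar');

EXEC sp_executesql @sql;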

Performance Tip #9825 - Add indexes & schedule a job to run

Examine indexes in the AsyncWorkItems tables

It is important to be careful with the AsyncWorkItems and AsyncWorkItemStatusHistory measure group tables. These tables are used to queue the jobs that are processed by the Planning workflow engine and therefore will have frequent insertions and deletions being done. Because of this, these two tables experience rapid and severe index fragmentation. Your database administrator must monitor the fragmentation percentage closely and rebuild the indexes more frequently than the other tables.

http://technet.microsoft.com/en-us/library/bb794627.aspx

How to automate index maintenance
The following script checks for all indexes on a Planning Server database and finds those that have an average fragmentation more than 10 percent. Then, depending on the degree of fragmentation, it performs a reorganize index or a rebuild index for those indexes.

http://technet.microsoft.com/en-us/library/bb838727.aspx
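The linked script is the real thing; its core amounts to checking the fragmentation DMV and then reorganizing or rebuilding. A minimal sketch against the tables named above (the 10/30 percent thresholds are the usual general guidance, not PPS-specific):

-- Check fragmentation on the workflow queue table (repeat for AsyncWorkItemStatusHistory)
SELECT i.name AS index_name, ps.avg_fragmentation_in_percent
FROM sys.dm_db_index_physical_stats(DB_ID(), OBJECT_ID('dbo.AsyncWorkItems'), NULL, NULL, 'LIMITED') AS ps
JOIN sys.indexes AS i
  ON i.object_id = ps.object_id AND i.index_id = ps.index_id;

-- Reorganize moderate fragmentation (roughly 10-30%), rebuild severe (over 30%)
ALTER INDEX ALL ON dbo.AsyncWorkItems REORGANIZE;
ALTER INDEX ALL ON dbo.AsyncWorkItems REBUILD;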

Associations - Add new members

When you create an Association and there is no corresponding dimension mapping in target, you need to select all of the source dimension members. There is a Select All option, however this just creates a list of members to copy.

What happens when you add a member to the dimension? You need to update the association.

Instead of going into the modeler, it appears this can be automated by adding rows to the AssociationDimensionScope table with some creative NativeSQL rules that pull from the associated D_ dimension table by label, for labels not in the association (see the sketch below). Or you could create a trigger on the dimension table... but triggers can cause problems, so this isn't recommended.
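A rough sketch of that idea follows. The column names on AssociationDimensionScope (AssociationId, MemberId) and the D_Product table are assumptions for illustration; check them against your actual application database before trying anything like this:

-- Hypothetical: add any dimension members missing from an association's scope
DECLARE @AssociationId int;
SET @AssociationId = 1;  -- the association to extend

INSERT INTO dbo.AssociationDimensionScope (AssociationId, MemberId)
SELECT @AssociationId, d.MemberId
FROM dbo.D_Product AS d
WHERE d.MemberId NOT IN (SELECT s.MemberId
                         FROM dbo.AssociationDimensionScope AS s
                         WHERE s.AssociationId = @AssociationId);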

Apparently this will be adjusted in a later version, currently in testing, which converts associations to the PEL language, allowing .members syntax rather than hard-coded associations.

Wednesday, August 20, 2008

Dissecting the model XML - Part 3 - GIGO

Usually your data will reside in CSV files, spreadsheets, or external databases. Dimension data will be loaded using CSV (1-time) or SSIS (maintainable and scalable).

You will probably use SSIS and the long-winded stored procedures within the staging database to create your label tables and stage the data, and some commands in the model or PPSCMD to stage the data, rinse and repeat.

It is a bit confusing for even the seasoned ETL developer. MS released a data import wizard that may help a bit.

Instead of using CSV or the wizard, I suggest you load your source dimension data into a SourceDB database. Then add tables using the Data Source tab to expose them to the Modeler application. You can even export the data back to the Source database after modifications in Modeler, using the Data Export job and a Data Export ruleset.

If you really need CSV files, you could try creating views on top of them using a file share on the SQL Server and linked server. Not sure how many compliance rules that one breaks though...

http://performancepointing.blogspot.com/2007/09/how-to-load-performancepoint-dimension.html

Data to be loaded into models (eg. actuals, historical forecasts) requires using either the new Data Wizard, some custom packages, or other methods of getting the data into the measure group staging tables. There is no way out of the box to import fact data into the model. You could create some NativeSQL rules to run an SSIS job from SQL Agent if you need your users to import data themselves.... hint...

Even though you're out of luck with Actuals and the Data Source, you can load dimension members from Source and map them to a specified table.

Back to the XML:

This section could be blank if you don't set up the Source and Destination data sources in the admin web.

Some self-explanatory nodes:

  • Name: StagingDB
  • Type (can be Data Source View (DSV) or Database): Database
  • Is it a Source or Destination?  Source: true
  • Is it enabled for access by Data Administrators?  true
  • SQL Server name: PPSTRAINING
  • SQL database name: FBP1_Staging


What tables are in the database? By default, none are exposed to the Data Source.

I am using the Staging Database as a data source (probably not a good idea since it can get modified by the Modeler).

A_Profit And Loss_MeasureGroup
A_Profit And Loss_MeasureGroup_Label
A_Sales Assumptions_MeasureGroup
A_Sales_MeasureGroup
AG_AccountClassification
AG_AccountType
AG_DebitCredit
AG_EntityType
AG_ExchangeRateType
AG_FlowType
AG_OtherDimAggregation
AG_TimeAggregation
AG_TimeBalance
D_Account
D_BusinessProcess
D_ConsolidationMethod
D_Currency
D_Customer
D_Department
D_Division
D_Employee
D_Entity
D_ExchangeRate
D_Flow
D_Geography
D_InterCompany
D_Location
D_Product
D_Scenario
D_Time
D_TimeDataView
D_User
DBSchemaVersion
FK_Relationships
H_Account_Profit and Loss
H_BusinessProcess_Standard
H_Customer_Customers
H_Department_Departments
H_Division_Divisions
H_Employee_Employees
H_Entity_Entities
H_Geography_Geographies
H_Location_Locations
H_Product_Products
H_User_Approver
H_User_Reviewer
MG_Profit And Loss_MeasureGroup_default_partition
MG_Sales Assumptions_MeasureGroup_default_partition
MG_Sales_MeasureGroup_default_partition
ModelSiteTree
NS_Account_Profit and Loss
NS_BusinessProcess_Standard
NS_Customer_Customers
NS_Department_Departments
NS_Division_Divisions
NS_Employee_Employees
NS_Entity_Entities
NS_Geography_Geographies
NS_Location_Locations
NS_Product_Products
NS_User_Approver
NS_User_Reviewer
SystemErrorDetails
test


These tables are available to the Modeler, provided they have permissions to access.

MG_Profit And Loss_MeasureGroup_default_partition_Label



Here's our Destination data source. Note that you should create the database and copy the 2 tables (BizSnapshotStatus and DBSchemaVersion) into any database you want to use as a destination; otherwise it won't get recognized as a valid destination.

I haven't created any tables in the destination yet but they would show up similar to above.

  • Name: OutputDB
  • Type: Database
  • Is it a Source?  false
  • Is it enabled for access by Data Administrators?  true
  • SQL Server name: PPSTRAINING
  • SQL database name: OutputDB







It would be useful if more source and target data sources could be used... perhaps ODBC or .NET providers?

Advantages and Disadvantages of PerformancePoint

Tom Stagg gives us some good points on where PerformancePoint shines, and where it falls flat.

http://tinyurl.com/676qho

Good planning resources

Yossi & Eran have some great links for PerformancePoint Planning resources. I've added the link to the Bloggers section.

http://blogs.microsoft.co.il/blogs/BeI/

Since there are a lot of you out there looking for a way in, I've decided to gather all my known links and recommendations to help you take your first steps. I've decided to call it the "Junkies List":
Taking the first steps: Adrian's Impressive Intro Tutorials (1-7):
http://blogs.technet.com/datapuzzle/archive/2008/03/21/help-how-to-guides-manuals-forums-for-microsoft-office-performancepoint-server-2007.aspx
Level 100-200: PPS Planning On-Line Training
http://www.microsoft.com/business/performancepoint/resources/training.aspx
Basics Concepts:
http://blogs.adatis.co.uk/blogs/sachatomey/archive/2008/04/28/new-microsoft-insider-pps-planning-blog.aspx
XL-Add-ON
http://blogs.msdn.com/petereb/default.aspx
Central Blog:
http://adriandownes.blogspot.com/
Planning Server Forum:
http://forums.microsoft.com/TechNet/ShowForum.aspx?ForumID=1871&SiteID=17
Paul Steynberg
http://paulsteynberg.blogspot.com
Nick Barclay
http://nickbarclay.blogspot.com/
Jason Morales
http://blogs.msdn.com/jasonmorales/default.aspx
Peter Eb
http://blogs.msdn.com/petereb/about.aspx
Sacha Tomey
http://blogs.adatis.co.uk/blogs/sachatomey/archive/2008/04/28/new-microsoft-insider-pps-planning-blog.aspx


Dissecting the Model XML - Part 2 - The Unchangeable Properties of Time

Within the model XML there are many tree nodes.

Under BizModelSite root node:

CalendarMetaData
DataConnections
Dimensions
DimensionAttributeGroups
Models
SubModelSites
WorkFlowTables

This posting will examine CalendarMetaData.

Time is very important in a PerformancePoint application. It affects Cycles, display of the time dimension, granularity of time within models, display of time in forms and filters, etc. etc.

The problem is, once you Create The Calendar, that is basically it. There is no way to reseed the data with different calendars, create multiple calendars, or change the start date of the calendar, without modifying the underlying database and probably the XML here and most likely other places. You can only extend the calendar for additional time ahead and create Calendar Views for different granularity settings.

There are also steps to take to ensure that locale changes affect the calendar correctly.

It is a bit messy if done wrong, since you'll have to start over.

When dealing with Balance Sheets, be sure to start the time period before the first time range you would like to see in your filter list. This allows for entry of historical closing period balances.

Test the different calendar modes to see if you need Fiscal calendars, etc. Review the documentation. Check out the sample apps in the AllUpBI VPC file MS provides. Create a test application, create the calendar, and prototype the time dimension in a form. You should find out fairly quickly how it is displayed and whether it is acceptable for your requirements.

About the application calendar
http://msdn.microsoft.com/en-us/library/bb839150.aspx

Alpine Ski House documentation
http://office.microsoft.com/download/afile.aspx?AssetID=AM102421841033

About the 445 calendar, "Annus Confusionus"

http://personal.ecu.edu/mccartyr/clash.html

The XML didn't turn out so well, will have to see about fixing in future posts.

There is a lot of information stored in the CalendarMetaData.  I'm not sure what some of the properties are; however, most are intuitive.

Generating applications with the various calendars may show differences that would allow for manipulation of this XML to change the calendar, though it is much safer just to recreate the application.

Dissecting the Model XML - Part 1

Download the XML Notepad tool.

http://www.microsoft.com/downloads/details.aspx?familyid=72d6aa49-787d-4118-ba5f-4f30fe913628&displaylang=en

Save the results of the query in query analyzer. (from the application database for the model).

-- export latest model version
select top 1 cast(cast(cast(xmlblob as varbinary(max)) as varchar(max)) as xml)
FROM dbo.BizAppNodes
order by bizappnodeversion desc

First node - the XML tags didn't survive the blog formatting, but the values were:

Fabrikam Inc Planning 1 (the model site name)
PPSTRAINING (OlapServerName)
0 (DeployedId)

Dissecting the node.

Id - a unique ID used throughout the tables in PerformancePoint.
ParentID - the unique ID of the parent model site.
BizTypeIdRef Id - the type of model. Stored in the PPSPlanningSystem database BizTypes table. It's a unique identifier. This makes it very tough to move your XML to another server, since the GUID may not be unique. Should confirm this.
OlapServerName - Machine name of the server. Defined in the Administrator web page.
DeployedId - has the version been deployed? Since we took the last version, it hasn't. However, this doesn't appear to be used, as the samples I checked were all 0.

Next entry - Dissecting the Model XML - Part 2 - The Unchangeable Properties of Time

Modifying the model's XML data

PerformancePoint Planning stores the bulk of model metadata in a large, bulky image blob. This blob is actually editable XML. The trick to reading this is to convert the data in the column.

-- view latest model version as XML
SELECT TOP 1 cast(cast(cast(xmlblob as varbinary(max)) as varchar(max)) as xml)
FROM dbo.BizAppNodes
ORDER BY bizappnodeversion desc

-- cast XML back to the image blob
SELECT TOP 1 cast(cast(cast(cast(cast(cast(xmlblob as varbinary(max)) as varchar(max)) as xml) as varchar(max)) as varbinary(max)) as image)
FROM dbo.BizAppNodes
ORDER BY bizappnodeversion desc

You may need to adjust the options within Query Analyzer to return larger blocks of XML. Tools - Options - Query Results - SQL Server - Grid - Unlimited XML.

This may make it possible (though it wouldn't be recommended) to change the XML in the model and report against it with external tools. Back up the databases first!
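Completing the round trip from the two queries above, writing an edited blob back would look roughly like this. Sketch only; run it against a restored copy, never production:

-- Hypothetical sketch: overwrite the latest version with edited XML
DECLARE @newXml varchar(max);
SET @newXml = '<BizModelSite>...</BizModelSite>';  -- your edited model XML goes here

UPDATE dbo.BizAppNodes
SET xmlblob = cast(cast(@newXml as varbinary(max)) as image)
WHERE bizappnodeversion = (SELECT MAX(bizappnodeversion) FROM dbo.BizAppNodes);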

PerformancePoint Language Tool Released

This tool will allow the PerformancePoint client and Management Reporter to change the language at run-time rather than install-time.

http://msdn.microsoft.com/en-us/library/ms998562.aspx

Performance Tip #9623 - Implement Web Service performance improvements

Some of these may not be possible to change since they're at the code level.

http://msdn.microsoft.com/en-us/library/ms998562.aspx

Tuesday, August 19, 2008

Monday, August 11, 2008

Whitepaper - Microsoft® Office PerformancePoint™ Server 2007 Design Basics

Excellent "best practices" guide to designing a Planning application in PerformancePoint.

Applies to: Microsoft Office PerformancePoint Server 2007

This document provides guidance and best practices for designing a Microsoft Office PerformancePoint Server 2007 Planning application.

http://office.microsoft.com/download/afile.aspx?AssetID=AM102830551033

Identifying the parameters for data population of a measure group staging table

Populating actuals in PerformancePoint isn't necessarily trivial.

Once you have created the staging database and synchronized tables into the database, you need to modify the contents of staging to include dimensions, hierarchies, and measures.

Usually, actuals are the first thing to go into a planning application, as variance reports are important along with seeded data.

This table holds the key to parameters for the [dbo].[bsp_DI_CreateLabelTableForMeasureGroup] stored procedure.

SELECT * FROM FK_Relationships

eg.

[dbo].[bsp_DI_CreateLabelTableForMeasureGroup] 'AppRoot:FinancialWithShares', 'MG_FinancialWithShares_MeasureGroup_default_partition'

Once this table is created:

select * from MG_FinancialWithShares_MeasureGroup_default_partition_label

This shows what rows are required. The PPS Operation document has some examples of this.

Modifying our Actual population view.

alter view vActualsByCostCentre_Unpivot
as
SELECT 'ACTUAL' as [Scenario_MemberId_Label]
      ,[Time_Month_Label]
      ,[Account_MemberId_Label]
      ,'INPUT' as [BusinessProcess_MemberId_Label]
      ,'Open Text Corporation Canada' as Entity_MemberId_Label
      ,'PERIODIC' as TimeDataView_MemberId_Label
      ,'NONE' as Flow_MemberId_Label
      ,[Cost Centre_MemberID_Label]
      ,'NONE' as Department_MemberId_Label
      ,'NONE' as Product_MemberId_Label
      ,'USD' as Currency_MemberId_Label
      ,[Value]
      ,null as RowID
      ,200 as [BizSystemFlag]
FROM
(
SELECT [CostCentre] as [Cost Centre_MemberID_Label]
      ,[Label] as [Account_MemberId_Label]
      ,[July Year2007] = [Jul07]
      ,[August Year2007] = [Aug07]
      ,[September Year2007] = [Sept07]
      ,[October Year2007] = [Oct07]
      ,[November Year2007] = [Nov07]
      ,[December Year2007] = [Dec07]
      ,[January Year2008] = [Jan08]
      ,[February Year2008] = [Feb08]
      ,[March Year2008] = [Mar08]
      ,[April Year2008] = [Apr08]
FROM [ActualsByCostCentre]
) [Value]
UNPIVOT ([Value] FOR [Time_Month_Label] in
      ([July Year2007]
      ,[August Year2007]
      ,[September Year2007]
      ,[October Year2007]
      ,[November Year2007]
      ,[December Year2007]
      ,[January Year2008]
      ,[February Year2008]
      ,[March Year2008]
      ,[April Year2008])
  ) AS UNPVT

Then

INSERT INTO MyApp_StagingDB.dbo.MG_FinancialWithShares_MeasureGroup_default_partition_label
      ([Scenario_MemberId_Label]
      ,[Time_Month_Label]
      ,[Account_MemberId_Label]
      ,[BusinessProcess_MemberId_Label]
      ,[Entity_MemberId_Label]
      ,[TimeDataView_MemberId_Label]
      ,[Flow_MemberId_Label]
      ,[Cost Centre_MemberID_Label]
      ,[Department_MemberId_Label]
      ,[Product_MemberId_Label]
      ,[Currency_MemberId_Label]
      ,[Value]
      ,[RowID]
      ,[BizSystemFlag])
SELECT [Scenario_MemberId_Label]
      ,[Time_Month_Label]
      ,[Account_MemberId_Label]
      ,[BusinessProcess_MemberId_Label]
      ,[Entity_MemberId_Label]
      ,[TimeDataView_MemberId_Label]
      ,[Flow_MemberId_Label]
      ,[Cost Centre_MemberID_Label]
      ,[Department_MemberId_Label]
      ,[Product_MemberId_Label]
      ,[Currency_MemberId_Label]
      ,[Value]
      ,[RowID]
      ,[BizSystemFlag]
FROM [MyApp_SourceDB].[dbo].[vActualsByCostCentre_Unpivot]

The last SQL step after populating the label table?

DECLARE @return_value INT

EXEC @return_value = [dbo].[bsp_DI_ConvertLabelColumnToMemberIDForMeasureGroup]
      @ModelName = N'MyApp:FinancialWithShares',
      @MeasureGroupTableName = N'MG_FinancialWithShares_MeasureGroup_default_partition',
      @OverwriteExistingData = N'F',
      @IncludeValidation = N'T',
      @IncludeAnnotation = N'F'

SELECT 'Return Value' = @return_value

GO

The modeler can now go into the application and Load this data from Staging Area.

TechNet Forums

PerformancePoint Server Consulting > Home

Rather than this approach, I would recommend creating an AppName_SourceDB SQL database for your application data and loading the CSV files into this source database.  The cut-and-paste method works, as do linked tables from Access, the more robust SSIS ETL, or your favourite ETL tool.

After the source database is created and the tables populated, they can be added to the model by creating a Source Data connection in Planning Administrator.  Views in this SourceDB could be created to point directly at the source database, using linked servers and even synonyms for increased abstraction.  These views can be added to the Source Data connection.  The SourceDB acts as a mapping warehouse and allows for ease of rerunning data and reuse for other purposes.

When the connection has been created, it can be loaded from the Modeler by using the Load from Source Data option.
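As a sketch of the linked-server-and-synonym idea (every name here is hypothetical):

-- Expose an ERP table to the SourceDB via a linked server
CREATE SYNONYM dbo.SourceCustomers
FOR [ERPSERVER].[ERPDatabase].[dbo].[Customers];
GO

-- Shape it for the Modeler; this view can be added to the Source Data connection
CREATE VIEW dbo.vDimCustomer
AS
SELECT CustomerCode AS Label,
       CustomerName AS [Name],
       CustomerName AS [Description]
FROM dbo.SourceCustomers;
GO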

Dimensions

Load dimensions via .csv files:

The .csv file must adhere to the following format.

  • Cell a1 = ! dimension = dimension name
  • Cell a2 = ! hierarchy = All Members
  • Cell a3 = Label (value required)
  • Cell b3 = Name (value required)
  • Cell c3 = Description (value optional)

Values in the Label and Name columns are required. Values in the Description column are optional, but the header is required in the template.
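Put together, a minimal file for a hypothetical Region dimension might look like this (the Description values are optional):

! dimension = Region
! hierarchy = All Members
Label,Name,Description
NA,North America,North American operations
EMEA,EMEA,Europe and Middle East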

PerformancePoint Server Consulting > Home

Give the New PIVOT and UNPIVOT Commands in SQL Server 2005 a Whirl

When dealing with importing Excel spreadsheets and cross-tab reports, UNPIVOT is your friend.

This will translate a column-based (cross-tab) date format to a row-based format which can be used in PerformancePoint.

create view vActualsByCostCentre_Unpivot
as
SELECT [CostCentre] as [CostCentreLabel]
      ,[Label] as [AccountLabel]
      ,[MemberID]
      ,[Value]
FROM
(
SELECT [CostCentre]
      ,[Label]
      ,[20070701] = [Jul07]
      ,[20070801] = [Aug07]
      ,[20070901] = [Sept07]
      ,[20071001] = [Oct07]
      ,[20071101] = [Nov07]
      ,[20071201] = [Dec07]
      ,[20080101] = [Jan08]
      ,[20080201] = [Feb08]
      ,[20080301] = [Mar08]
      ,[20080401] = [Apr08]
FROM [ActualsByCostCentre]) [Value]
UNPIVOT ([Value]
    FOR [MemberID] in
      ([20070701]
      ,[20070801]
      ,[20070901]
      ,[20071001]
      ,[20071101]
      ,[20071201]
      ,[20080101]
      ,[20080201]
      ,[20080301]
      ,[20080401])
  ) AS UNPVT

Give the New PIVOT and UNPIVOT Commands in SQL Server 2005 a Whirl

Useful resources – VHD Resizer & VMWare Converter

 

VMDK (VMWare) to VHD Converter - Vmdk2Vhd is a simple utility to convert virtual ha... (20 Nov 2006, 48.1kB)

VHD Resizer - VhdExpand is now VhdResize. VhdResize will resize ... (17 Jan 2007, 170.7kB)

Christophe Fiessinger's Blog

 

How to extend the size of a virtual image (VHD)?

Let’s assume you have an EPM demo VPC with one disk and one partition of 19.53GB and you’d like to extend it to install additional software.

Download VhdResizer, open the VHD you want to update, and set its new size to 25GB for instance (Dynamic type).

  • Boot the VHD using Windows PE, for instance (Windows Preinstallation Environment (Windows PE) 2.0 is a minimal Win32 operating system with limited services, built on the Windows Vista kernel).
  • At the command line, launch DISKPART:
    LIST VOLUME (to identify the VHD volume to extend)
    SELECT VOLUME n
    EXTEND
  • Done! (LIST VOLUME will show you the extended size)

Start the VPC and the virtual C: partition should now be 25GB (the physical VHD size should not change since it’s a dynamic disk)

Christophe Fiessinger's Blog

PerformancePoint Monitor & Analyze UDF Project (MAUDF) - Home

 

MAUDF Project
The MAUDF project is a series of CLR table-valued User Defined Functions that facilitate access to metadata for elements published to a specific PerformancePoint Server 2007 Monitoring and Analytics (M&A) instance. The aim of the MAUDF project is to make otherwise hard-to-get-at metadata more accessible through these functions. Because the data exposed is tabular this provides an easier way for system administrators and developers to surface useful information contained within a particular Monitoring server using just T-SQL.
UDF List

  • ElementMetadata - Element-level metadata common to all elements, e.g. name, description, display folder
  • ElementMemberships - Lists each membership domain account and the assigned privileges for an element
  • ElementCustomProperties - Lists information for all custom properties that exist on an element
  • KpiMetrics - Lists metadata for each actual and target metric within a KPI
  • KpiTargetThresholds - Lists detailed threshold settings for each target metric within a KPI
  • ScorecardMetadata - High-level property settings for published scorecards
  • ScorecardKpis - List of all KPIs referenced within a scorecard
  • ScorecardAnnotations - Lists all annotations and associated comments for a scorecard
  • ReportMetadata - High-level metadata for published reports
  • IndicatorMetadata - High-level metadata for published indicators
  • IndicatorLevels - Lists properties for each level defined within an indicator
  • DataSourceMetadata - High-level metadata for published data sources
  • DashboardItems - Lists metadata for each scorecard, report and filter item referenced within a dashboard

PerformancePoint Monitor & Analyze UDF Project (MAUDF) - Home

BI Report Automation Publisher - Home

 

Current Features

PerformancePoint (PPS) Publishing
Moves elements from one server to another, adjusting server-specific settings for related elements automatically, including:
  • PPS Data Connections
  • Proclarity Analytic Server Report Links
  • Excel Service SharePoint Links
  • Excel Service Parameters
  • Add/Remove Credentials
Proclarity Analytic Server (PAS)
  • Update report page connection properties to redirect to a different SSAS server.
Excel Services
  • Dynamically change link to Office Data Connection file.
  • Provide a seamless way to create a list of Links to Proclarity Analytic Server reports
Configurability
  • .NET configuration API is used, allowing flexible XML configuration and validation.
  • Command-line tool execution; output is sent to a trace listener and is configurable.
  • Use MSBuild or WIX to inject properties into the Config file before execution, to truly integrate this in the build/deploy environment.

BI Report Automation Publisher - Home

Microsoft Dynamics CRM Team Blog : CRM Accelerators – Part I – Analytics Accelerator

 

CRM Accelerators – Part I – Analytics Accelerator

Last week we introduced the CRM Accelerators program for Microsoft Dynamics CRM customers and partners; each week we will unveil more details of each accelerator. In this article we feature the Analytics Accelerator.
Analytics Accelerator – Due to be released in Q3 2008

Many customers and partners will be familiar with Analytics Foundation for Microsoft Dynamics CRM 3.0 which was a business intelligence feature pack for Microsoft Dynamics CRM 3.0. Analytics Foundation for Dynamics CRM 3.0 included Dashboards, Key Performance Indicators (KPI’s) and a suite of new reports. Analytics Accelerator for Microsoft Dynamics CRM 4.0 expands on this, and is designed to support a range of business intelligence “maturity” levels for customers and partners. The Analytics accelerator has been built to cater for differing business intelligence skill-sets and experience levels. Elements of the accelerator include:

  • Sales, service and operations dashboards delivered with Microsoft SQL Server® Reporting Services (SSRS).
  • Sales, service and operations dashboards and scorecards delivered with Microsoft PerformancePoint® Server.
  • Additional reports delivered with SSRS.
  • Two Online Analytical Processing (OLAP) cubes which are designed to allow customers to conduct in-depth analysis and key performance indicator (KPI) monitoring based on Microsoft Dynamics CRM 4.0 data. Customers can consume these cubes through Microsoft Excel®, Microsoft SharePoint® technologies and Microsoft PerformancePoint Server. The first OLAP cube is designed for simple setup and consumption, the second cube is more detailed and requires a more advanced knowledge of analytics and OLAP technologies within Microsoft SQL Server.
  • Predictive analytics to provide statistical guidance in areas such as lead and sales opportunity management. There are several sample applications which ship as part of the accelerator that showcase how customers can leverage predictive analytics directly within the Microsoft Dynamics CRM 4.0 user interface.
  • The analytics accelerator will work with SQL Server 2005 with some components enhanced to take advantage of SQL Server 2008.

The Analytics accelerator will be free of charge for Microsoft Dynamics CRM 4.0 customers and partners.

Microsoft Dynamics CRM Team Blog : CRM Accelerators – Part I – Analytics Accelerator

Norm's PerformancePoint Server Blog : PerformancePoint Server Security and Kerberos

Some good links when dealing with the various complexities of PerformancePoint, SQL, IIS, and Sharepoint security.

  1. List and provide links to our existing security-related content.
  2. Solicit feedback regarding security-related content. For example:
    • What are we missing?
    • How can we improve what we currently have?
    • Anything else you would like to share with us?

Here is a list of and links to our existing security documentation — organized according to our PPS book categories (Architecture and Planning, Deployment, and Operations):

Norm's PerformancePoint Server Blog : PerformancePoint Server Security and Kerberos

Thursday, August 7, 2008

Orange Peel - Business Intelligence Studio for PerformancePoint

Not a lot of info on this yet.

Business Intelligence Studio (BIS) enables full utilization of all Microsoft Office PerformancePoint Server 2007 (PPS) features. It is a solution that accelerates the implementation of PPS, enables end user productivity and provides valuable budgeting/planning functionality to facilitate better Business Performance Management.

Orange Peel

Wednesday, August 6, 2008

Norm's PerformancePoint Server Blog – Training

I've got my plane tickets booked for the August 18-22 training in Denver.  Monitoring has received much wider adoption, probably due to the integration with existing Proclarity customers and product maturity.  I'm not sure how many Biz# (the predecessor to PPS Planning) customers there were out there…

Planning should be a good field for extended learning as there are a lot of opportunities for improvements in this space.

PerformancePoint Planning Server training

Hi there! My name is Joanne Hendrickson and I work as a writer here at Microsoft on the PerformancePoint Server Content Publishing team. I, too, will be a regular contributor to this blog discussing all things PerformancePoint. I am passionate about getting you the clearest, most valuable content to help you solve your business problems using clean, simple approaches. I try to frame a solution answering the "Why?" and find that type of adult-learning approach can serve us all well as we tackle the challenges in front of us.

One great way to get in the know about Planning is to attend a Microsoft Office PerformancePoint Planning implementation workshop that will be offered at Microsoft's Denver offices starting in September.

These are PPS workshops, solely focused on Planning, that last 4½ days. As the course description states, this course: "walks through setup of a budgeting solution for an enterprise-level entity. Company and business scenarios are presented to the students with requirements and budget assumptions, offering a discussion on how to prepare for a Planning implementation. As part of the class deliverables, students will create an application, set up dimensions, models and load data. Profit and Loss budget input forms are built, including sub ledger detail forms such as Human Resources and Revenue modeling. Business rules are defined to further customize the input forms. Students will set up Security and define processes to replicate a real-world user experience. Reports are designed in various formats to show the end results needed to analyze the planning information." Now really, how fun is that?! Sign up! Any partners are welcome. 

Here are the registration details:

In case your company has considered becoming a partner, see the Partner Program Web site.

Norm's PerformancePoint Server Blog

Norm's PerformancePoint Server Blog and another Performance Tip?

Norm has some good info on PPS vs. Proclarity.

 

Here is a response I found to the following questions.

  1. What’s in PPS and not in ProClarity ?
  2. What’s in ProClarity and not (natively) in PPS ?

Norm's PerformancePoint Server Blog

Another tip for speeding up the PerformancePoint VPC?  Within the Planning Admin Console, there is a Planning Process Service option.  This lets you set the polling interval for queue checking.  The default is 60 seconds; it has been set to 4 seconds in the VPC.  Decreasing or increasing it may slightly affect performance.

Tuesday, August 5, 2008

Tip #9002 – Refresh data without exiting Excel

From the forums.

There is a way to see the latest data without closing and re-opening Excel.

In the Excel add-in, go to Offline->Manage Offline Assignments. Then check the box of your assignment, and click the refresh button.

Re: MDX Time spreading rule - TechNet Forums

Howard @ MSFT selling BI: Spreading across Time

One workaround for spreading across time (years to months) in PerformancePoint.

Spreading across Time

This is unsupported through the Excel interface. The problem is that when you create a cycle and assignment, there is a concrete data collection period. You must enter data in that range AND you must enter data at the leaf level. One workaround is to collect data for the year and stick it in a month in the future. For example, you want to spread 12 months of data for 2008: you create a form and assignment, collect the data, and put it in Jan FY09. Then you write a business rule, scope the 12 months with the ":" operator, and point the this statement at Jan FY09. Then you can divide the number equally by 12. You can get more creative with your spreading algorithm too. The only thing is, you have to submit the Jan FY09 number and then run a job that runs the rule, so it's a two-step process.

Howard @ MSFT selling BI: Spreading across Time

Improving Performance in Excel 2007

An important component to PerformancePoint Planning is the Excel 2007 add-in.  Here are a few ways to speed up Excel 2007 which may improve certain complex spreadsheets.

You can modify most slow-calculating worksheets to calculate tens, hundreds, or even thousands of times faster. The objective of this article is to review how you can speed up calculation by making the best use of the new features in Excel 2007, and by identifying, measuring, and then improving the calculation bottlenecks in your worksheets.

Improving Performance in Excel 2007

Deployment Failed - TechNet Forums

Tip #9001 – Full deploy of PerformancePoint model site. (from forums)

When you carry out a deploy of the model site within PBM, it only carries out an incremental deployment to AS. You can carry out a full deployment using ppscmd.exe, using the syntax:

ppscmd modelsitedeploy /full /server <planningserver url> /path <applicationlabel>.<modelsitelabel>

This will delete the AS database and re-create it. You may also find a bit more help in the PPS operations guide - http://technet.microsoft.com/en-us/library/bb838754.aspx

Deployment Failed - TechNet Forums

Friday, August 1, 2008

Callisto - Excel Add-in for PerformancePoint

Callisto could be a valuable replacement for the Excel Add-in for PerformancePoint.

It supports Excel 2000, 2003, 2007.

It supports loading, extracting, and clearing data with the Planning Model directly from Excel.

It has the ability to perform driver-based forecasting, and appears to remove some limitations of the matrix control.

Worth a look.

Callisto - Excel Add-in for PerformancePoint

More on PerlieLab

https://solutionfinderbeta.microsoft.com/SDK/Partners/PartnerDetailsView.aspx?partnerid=467f04c4ea924b69a9582cb094a710f8