
Thursday, July 31, 2008

Setting MAX SQL Server Memory value - SQL Server Performance

Boosting the All-UP VPC Performance - Set max SQL memory to something reasonable instead of unlimited.

Modify IIS to disable logging for PerformancePoint Planning site.

Turn off performance counters.

Configure Analysis Services memory limit.
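The first tip, capping SQL Server's memory, can be scripted with sp_configure. A minimal sketch; the 512 MB cap is an illustrative value for a RAM-starved VPC, not a recommendation:

```sql
-- Cap SQL Server's memory so it leaves room for SSAS, IIS, and SharePoint.
-- 512 MB is an assumed value; size it to your VPC's actual RAM.
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'max server memory (MB)', 512;
RECONFIGURE;
```

The same setting is available in Management Studio under Server Properties > Memory.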

 

Setting MAX SQL Server Memory value - SQL Server Performance

The Sean Blog : Virtual PC 2007 Released!

Sean's tips on VPC performance:

Now, in order to set this post aside from some of the other "Virtual PC 2007 has been released!" postings, I thought I would share some best practices relating to the usage of Virtual PC that I have picked up from doing demos over the last year or so (as well as from an internal demo optimization doc that made its way into my inbox a few months ago):

  1. I'll start with the obvious… the more RAM the better. You need to have enough RAM to run your host operating system, your guest operating system(s), plus an additional 32MB of overhead for each guest OS that you will be running. From my personal experience, 2GB of RAM is a good starting point.
  2. A CLOSE second to RAM is your hard drive configuration. If you are running your Host OS and Virtual machines from the same hard drive, you are in for a painful experience. For good performance you will want to move the .vhd files off to the fastest secondary hard drive possible. For a secondary internal hard drive, SATA or SCSI is ideal, and for an external HD, pick up an eSATA drive. This advice is even more important if your computer is a laptop, as they often have slow 4200 RPM hard drives that will struggle with the concurrent I/O load.
  3. If your antivirus allows you to exclude files or directories from real-time scans, exclude the directory that holds your images (and/or exclude the following extensions: .vhd .vud .vsv .vfd .vmc).
  4. Keep the host computer hard drives defragmented.
  5. The smaller (and less fragmented) the VHD files, the faster they perform. Steps to shrink/defragment:

    1. In the VM, run defrag (a couple of times). It might take a while (up to hours), but it is worth the wait.
    2. Run the VM precompactor – this is an ISO image; just attach it to the VM. The precompactor will zero all the unneeded space in the VHD:

      1. Load the 'Virtual Disk Pre-Compactor.iso' from the Virtual Machine Additions Directory
      2. Answer 'Yes' to start the zeroing
      3. Shutdown the VPC when done
      4. Use Virtual PC Disk Wizard to edit and compact the pre-compacted .vhd
  6. Use a processor with Virtualization Technology (Intel Page, AMD Page). This is any reasonably recent processor. It will greatly speed up OS boot time, OS installation time, and performance in non-Windows operating systems (as well as in Windows installations that do not have the Virtual PC Additions loaded). On guest Windows installations that do have the Additions loaded, performance is also significantly enhanced, which brings us to:
  7. Load the Virtual PC Additions. These are included with Virtual PC, and will need to be loaded in the guest operating system. This will load drivers for the virtualized hardware, allow folder sharing (with the host OS) and drag-and-drop functionality (from the host OS), synchronize time with the host OS, and generally turn on "Teh Snappy".
  8. Start Virtual PC with the -usehostdiskcache option (the best way is to create a shortcut on your desktop). From the command-line reference:

    Turns on host-side disk caching, which can improve performance of virtual machines running operating systems other than Windows. This parameter can be useful for resolving poor performance problems with disk intensive tasks. This parameter can be used only when starting Virtual PC.

  9. Do NOT use the /3GB switch on your host machine. That switch increases user memory space at the expense of kernel space. Virtual machines use more kernel memory than user memory, so enabling this switch on the host will decrease the number of virtual machines that you can launch.
  10. I'll leave you with a tip that would seem to defy logic (and which I have not tried myself), but the word on the street is that you can increase performance by enabling NTFS compression on the directory that holds your Virtual PC image files (.vmc, .vud and .vhd). It seems that writing a big chunk to disk is slower than first compressing the chunk and then writing it.

The Sean Blog : Virtual PC 2007 Released!

some AddIn for Excel SP1 performance improvements details beyond the readme... - TechNet Forums

Turn off what-if analysis in Excel forms and reports for improved performance.

What is what-if?

When you set the Enable what-if analysis option to True for a matrix, PerformancePoint Add-in for Excel automatically performs what-if calculations that show the effect that the change will have on an assignment if it is submitted.

If this option is set to False, PerformancePoint Add-in for Excel does not apply any changes to the matrix. This can help improve query performance.

Some informal (but real-world) feedback from internal deployments and some preview customers tells us that common refresh/what-if operations in SP1 take 50% of the RTM time or better. In one extreme case, which could take advantage of the option to turn off what-if, the time was as low as 5%, but I don't think that will be typical.  Opening assignments sounds like it takes 50% to 33% of the RTM time in SP1. So far I haven't seen any negative performance feedback, knock wood, cross fingers, and any other anti-jinx superstitions.

some AddIn for Excel SP1 performance improvements details beyond the readme... - TechNet Forums

SQL Server 2005 Analysis Services (SSAS) Server Properties

This may cause issues with PerformancePoint as we don't have control over the processing job, but let's set it to false and see.

Lazy processing allows fast updates to the dimensions. After an update, SSAS processing of aggregations and indexes is initiated in the background. As a result of lazy processing, you might see Analysis Server consuming CPU and disk I/O at times when no user activity is performed. To avoid lazy processing, you can set the ProcessAffectedObjects property as part of a dimension update processing command, causing all aggregations and indexes to be processed as part of the same transaction.
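The lazy-processing workaround described above can be sketched in XMLA; ProcessAffectedObjects is passed on the Batch element wrapping the processing command. The database and dimension IDs below are placeholders, and I haven't verified this against the jobs PerformancePoint generates:

```xml
<!-- Sketch: ProcessAffectedObjects="true" processes dependent aggregations
     and indexes in the same transaction instead of lazily in the background.
     DatabaseID/DimensionID are placeholder names. -->
<Batch xmlns="http://schemas.microsoft.com/analysisservices/2003/engine"
       ProcessAffectedObjects="true">
  <Process>
    <Object>
      <DatabaseID>AdventureWorks_Mfg</DatabaseID>
      <DimensionID>SomeDimension</DimensionID>
    </Object>
    <Type>ProcessUpdate</Type>
  </Process>
</Batch>
```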

SQL Server 2005 Analysis Services (SSAS) Server Properties

SQL Server 2005 Analysis Services (SSAS) Server Properties

Let's bump this one up from 64 to 128. Let's not... :) Either this or the other setting maxed out the memory in the VPC...

MaxThreads

MaxThreads is a server property that controls how many threads, at maximum, will be spawned for either query or processing operations.

SQL Server 2005 Analysis Services (SSAS) Server Properties

SQL Server 2005 Analysis Services (SSAS) Server Properties

Bumping up (or tuning down?) performance.

Set this to -5 or -6 perhaps? I'm running a dual-core laptop, so this might improve things. Probably not a good idea if you're using a VPC which appears to use only 1 CPU...

CoordinatorExecutionMode

CoordinatorExecutionMode is a complex SSAS server property that controls the number of parallel jobs that Analysis Server will start. When operations (such as querying and processing) occur in parallel, performance can be significantly enhanced. However, too many parallel jobs on a server can decrease responsiveness to other operations. The CoordinatorExecutionMode property’s value has special meaning as it controls the SSAS coordinator component, as follows:

Positive number. Sets the absolute maximum number of parallel jobs per server.

Zero. Enables the server to automatically determine the maximum number of parallel operations, based on workload and system resources.

Negative number. Sets the absolute maximum number of parallel jobs that can occur per processor (as opposed to a positive number, which is per server).
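The property can be changed in the SSMS server properties dialog, or directly in msmdsrv.ini (back the file up first). A sketch of what the entry looks like; -4 is the shipped default as far as I recall, so the -5 or -6 above is just the experiment described:

```xml
<!-- msmdsrv.ini fragment (surrounding elements omitted).
     A negative value means "up to that many parallel jobs per processor". -->
<CoordinatorExecutionMode>-4</CoordinatorExecutionMode>
```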

SQL Server 2005 Analysis Services (SSAS) Server Properties

SQL Server 2005 Analysis Services (SSAS) Server Properties

Under the Advanced Properties of Analysis Services, set OLAP\Process\CheckDistinctRecordSortOrder to 0. (At your own risk!!! :)

CheckDistinctRecordSortOrder

CheckDistinctRecordSortOrder is a Boolean server property that dictates how record sort orders will be handled. When processing distinct counts coming from a relational data source, Analysis Services needs to have records in a specific order to ensure accurate counting.

If collations are different between Analysis Services and the relational source, an error might occur if this property is set to True. On the other hand, if collations are known to be the same, performance can be enhanced by setting the property to False, but no checking will occur.
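The Advanced Properties path OLAP\Process\CheckDistinctRecordSortOrder maps to nested elements in msmdsrv.ini. A sketch with the surrounding elements elided; only flip this if you know the collations match:

```xml
<!-- msmdsrv.ini fragment: skip the distinct-count sort-order check.
     Only safe when SSAS and the relational source share a collation. -->
<OLAP>
  <Process>
    <CheckDistinctRecordSortOrder>0</CheckDistinctRecordSortOrder>
  </Process>
</OLAP>
```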

SQL Server 2005 Analysis Services (SSAS) Server Properties

Wednesday, July 30, 2008

Infosys | Microsoft: Tool for PerformancePoint Migration

 

Tool for PerformancePoint Migration

Microsoft has released the tool BI Report Automation Publisher to a wider audience. This is a command-line tool with a .NET library that makes it easier to move PPS/ProClarity reports between servers.

Microsoft built this tool mainly for PPS, but it can also be used to move ProClarity and Excel Services reports to different environments. One can configure the server details as well. The tool is being enhanced by the Microsoft team to add more features, such as:

  1. PerformancePoint Target Location Redirection
  2. PPS Sql Server Reporting Services (SSRS) Report Support
  3. PerformancePoint Input from BWSX Files
  4. Deploying PAS Reports
  5. Deploying PPS Dashboard
  6. Updating Post-Deployed PPS Webparts

More details on the tool and its source code are available on CodePlex. It's very interesting to see the number of hits on the site and the count of people who download the tool.

Infosys | Microsoft: Tool for PerformancePoint Migration

Tuesday, July 29, 2008

Speeding up the All-up BI VPC

There's a lot of stuff on the MS BI VPC.  Server operating system.  SQL Server & Analysis Services & Reporting Services.  SharePoint.  IIS.  PerformancePoint.

It's a wonder it even starts up at all.

I'm going to focus on a few ways to speed up the All Up VPC, besides adding more RAM, CPU, and fast disks.

First, get the latest All up BI VPC from MS downloads.

All-up BI VPC

For folks who want to play with our Business Intelligence technologies and don't want to bother with installing everything, you can download a virtual machine image that has just about everything installed you could want, including demo content.

The "All-up BI VPC" has two parts, available at:

Part One (3.4 GB)

Part Two (1.0 GB)

Yes - *giga*bytes. These are compressed, and will unzip to a vhd file for Virtual PC that's over 20GB in size. To run the virtual PC, I highly recommend at least 1GB free memory (so your PC needs at least 2GB). If you can free up more, that's better. Also try to work on a PC with two hard drives, and put this on the non-system drive.

Philo's WebLog : All-up BI VPC

Then do some reverse-engineering of queries to see what is going on.

Step 1 - SQL Profiler - Analysis Services
I noticed that the AdventureWorks_Mfg Analysis Services database uses Proactive Caching.  Disabling this will help a tad, if you're not using it.  You can do this by browsing to the cube in Analysis Services manager and right-clicking on the measure group partition.

Step 2 - SQL Profiler - SQL Server
I noticed that the bcp_BizSystemGetAllApps script is called all the time, and having it return the b.OriginalBMO xml column is a big drag on performance.  I commented the column out of the proc's query with no ill effects (just yet... but beware.)

Step 3 - Add indexes
Add an XML index to the OriginalBMO column of the BizSystem table in the application database and watch the modeler fly.  Actually, add XML indexes to all XML columns.  Either it's in my head or this improved things dramatically.  No more remote scans...

select * from sys.objects o inner join sys.columns c on c.object_id = o.object_id where c.user_type_id = 241 will get you a list.  There are some in the application database and some in the PPSPlanningSystem database.
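A sketch of the index itself, using the BizSystem.OriginalBMO column called out above. Creating a primary XML index requires the table to have a clustered primary key, which I'm assuming here:

```sql
-- Primary XML index on the column the modeler hammers.
-- Assumes dbo.BizSystem has a clustered primary key (required for XML indexes).
CREATE PRIMARY XML INDEX IX_BizSystem_OriginalBMO
ON dbo.BizSystem (OriginalBMO);
```

Repeat for each XML column the query above turns up, in both databases.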

Note that any of these changes could break PerformancePoint.  Try at your own risk!

Step 4 - SQL Analysis Services - Under the properties of the bi-vpc, turn off the flight recorder.

Step 5 - SQL Server - Take offline any databases you are not using.  So long ProjectReal, you fake project!
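Taking a database offline is one statement; ProjectReal is the example from above, and ROLLBACK IMMEDIATE is there to kick out any open connections first:

```sql
-- Free up memory and I/O by taking an unused demo database offline.
ALTER DATABASE ProjectReal SET OFFLINE WITH ROLLBACK IMMEDIATE;
-- Bring it back later with:
-- ALTER DATABASE ProjectReal SET ONLINE;
```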

Step 6 - Services - turn off Automatic Updates.  Disk Defragmenter. Office Sharepoint Server Search.  Office Document Conversions services, Windows Sharepoint Services Search.

Monday, July 28, 2008

Peter Eb. : HOWTO: Make 1 filter control more than 1 matrix

 

A common request by form authors is to create a "start here" worksheet page where contributors can change some global filters. Then they advance to other worksheets in the workbook to do their actual data entry. Other times authors just want to link multiple matrices so that changing the filter on matrix A also changes matrix B at the same time. The way RDL is structured makes it natural for PPS to define all filters as global: the add-in stores them as ReportParameters in RDL, which are global. Then each matrix subscribes to the filter.

Peter Eb. : HOWTO: Make 1 filter control more than 1 matrix

PerformancePoint - Where did my applications go?

If you're not seeing any applications when connecting from the modeler, and perhaps you ran some DB upgrade scripts, chances are your applications are offline.

This can be changed by going to http://localhost:46788/Applications.aspx , connecting, clicking on applications and selecting each app to Take Online.

Friday, July 25, 2008

Norm's PerformancePoint Server Blog

An interface would be good for stuff like this. 

Two approaches exist for converting to label-based tables in preparation for populating PerformancePoint Server measure group tables.

  • Use the PPS stored procs - more of a production solution.
  • Use a join between the source and staging tables and pull IDs.

Here are the steps from the data integration help.

To prepare and load measure group data that have labels
  1. Create a label-based table

  2. Populate label-based measure group tables with data from the source database or system. This is a manual step.

  3. Validate label-based measure group tables and correct any errors by using the bsp_DI_ValidateLabelTable stored procedure.

  4. Repeat step 3 as required until all label-based data is valid.

  5. Convert the label-based measure group table to a MemberID-based table.

About performing measure group validation in a model

You can use one of the following methods to validate measure groups before you load data from the staging database to the application database.

Note

If you have questions about the process, don't forget to use the PerformancePoint Server forum. You can also send me a message if you are concerned about whether a question is phrased correctly; I'm a writer.

Norm's PerformancePoint Server Blog

Norm's PerformancePoint Server Blog : New Planning Data Migration Tool !

 

PerformancePoint Server 2007 Planning Data Migration Tool

There are two primary steps to a Migration, with many tedious “sub-steps”. 

The first step in the process is to migrate the application. This part of the process migrates:

  • Security roles
  • Structural metadata: all model site, model, dimension, and member set definitions
  • Data Sources and Data Destinations
  • Calculation rules and any associated job templates
  • Form templates
  • Calendar

The second (more tedious) step is to migrate the data, which migrates the following:

  • Dimension and hierarchy data
  • Fact data and annotations
  • Associations

The “PerformancePoint Server 2007 Planning Data Migration Tool” automates the second set of steps, the “Data Migration” portion.  The full manual process (which requires the user to have intricate knowledge of the application) is documented on TechNet here:

  1. Preparing the Staging DB on the Source Server: http://technet.microsoft.com/en-us/library/bb838747.aspx
  2. Loading the Migrated Application on the Target Server: http://technet.microsoft.com/en-us/library/bb838692.aspx

Norm's PerformancePoint Server Blog : New Planning Data Migration Tool !

Tuesday, July 22, 2008

White paper: Performance tuning for PerformancePoint Planning Server

 

White paper: Performance tuning for PerformancePoint Planning Server

This white paper is for analysts and IT professionals who want guidance about PerformancePoint Planning Server scalability capabilities and the number of users that can be served. Considerations for hardware planning and software tuning are also covered.

Download this white paper as a Microsoft Office Word document (.doc).

Performance tuning for PerformancePoint Planning Server (http://go.microsoft.com/fwlink/?LinkID=100209&clcid=0x409)

White paper: Performance tuning for PerformancePoint Planning Server

Downloadable implementation plans

 

Downloadable implementation plans

This guide contains detailed information about how to plan a deployment of Microsoft Office PerformancePoint Server 2007.

The PerformancePoint Server Implementation Guide consists of 12 Microsoft Office documents (templates, projects, and worksheets) that can help you plan your implementation of PerformancePoint Server. Start with the Project Plan Framework presentation for a detailed introduction to the tasks involved in implementing PerformancePoint and for guidance on how to use the templates most effectively. Included in the guide are templates for configuration planning, training, and requirements gathering, as well as a template for a personalized operations and maintenance plan.

Downloadable implementation plans

Nick Barclay: BI-Lingual: The PPS Monitor & Analyze UDF Project (MAUDF)

Here is a useful utility to pull the tough-to-get-at PerformancePoint M&A repository data out.

Reuse & Recycle.

The PPS Monitor & Analyze UDF Project (MAUDF)

I've often wanted to be able to access metadata within a monitoring server with greater ease. Querying the PPSMonitoring database is difficult because the more interesting pieces of information pertaining to a particular element are locked away within the SerializedXML column of the FCObjects table. Sure, we can access the data in the SerializedXML column with XPath/XQuery, but I really wanted to get stuck into some CLR stuff and figured UDFs would provide the greatest reuse, flexibility and query-ability.

I've created a series of CLR table-valued UDFs and created a home for them on CodePlex, a project that I have named MAUDF (taking a bit of inspiration from the ASSP project). Using the CLR, the MAUDFs connect to the PPS Monitoring web service, extract pertinent metadata, and return the results in tabular format.

Nick Barclay: BI-Lingual: The PPS Monitor & Analyze UDF Project (MAUDF)

Radoslaw Lebkowski's Blog: Updating PerformancePoint Server 2007 with SP1 – step by step instructions

Radoslaw posted some good links to help with upgrading PerformancePoint to SP1 – including one from a co-worker of mine.

Since the release of SP1 for PerformancePoint Server 2007, I've found many posts on internet forums about PPS describing errors encountered while installing SP1.
For example, one of them described problems with planning applications, which disappeared in Business Modeler after upgrading Planning Server with SP1.
Installing SP1, especially for Planning Server, is not just a matter of running the SP1 installation files; there are many other important steps.
If you omit some of them, you can run into errors like the one above.
If you are going to upgrade your PerformancePoint Server, I strongly recommend reviewing the following articles before you start:
Detailed instructions on Microsoft Technet:
Updating PerformancePoint 2007 Planning Server with Service Pack 1
Updating PerformancePoint 2007 Monitoring Server with Service Pack 1
And if you need another step by step instruction with many screenshots included, read the following article:
http://performancepoint07.blogspot.com/2008/06/installing-performancepoint-planning.html

Radoslaw Lebkowski's Blog: Updating PerformancePoint Server 2007 with SP1 – step by step instructions

Thursday, July 3, 2008

The Microsoft ProClarity Team Blog : Totaling in ProClarity

 

I've written a paper on Totaling in ProClarity.  It's certainly not comprehensive but I do hope it answers some common questions regarding totaling.  Please contact us with any feedback or questions.

The Microsoft ProClarity Team Blog : Totaling in ProClarity