MSBI (SSIS/SSRS/SSAS) Online Training

Friday, November 19, 2010

Future MSBI-Power Pivot ‘N’ DAX




Here are Vidas Matelis's comments on PowerPivot and DAX:
- from the session “SQLCAT: A Preview of PowerPivot Best Practices”. Presenters: Denny Lee and Dave Wickert:
• #PowerPivot spreadsheet is stored in the SQL database when deployed to SharePoint
• #PowerPivot uses the AS engine – an in-memory, column-based store with VertiPaq. All calculations are local after import.
• Excel has its own local in-process SSAS engine. New #PowerPivot functions – Data Analysis Expressions (DAX).
• #PowerPivot compression is about 10:1 (approx) and depends…
• More info about #PowerPivot: VertiPaq compresses about 10:1, then SSAS adds compression of about 1.5:1 or 2:1 on top.
• MOLAP stores multiple records at the grain of the fact table, so some records can be rolled up.
• VertiPaq has a row for each source record, so it is different from MOLAP.
• There is an on-disk structure for #PowerPivot that looks similar to SSAS. c:\local user\… Additional info: Denny Lee just blogged about this with more details.
• The customdata sub-folder has a file with a backup of the SSAS database for #PowerPivot.
• Showing a demo of the Silverlight gallery of #PowerPivot reports in SharePoint.
• #PowerPivot - when a workbook is opened in Excel Services, there is no interaction with SSAS at first.
• When you click a slicer, Excel Services / the #PowerPivot service talks to the AS engine. So the SSAS database is loaded on demand.
• The first initialization could take time – it depends on traffic, etc. The same published #PowerPivot workbook can serve many people.
• Need to optimize SharePoint for #PowerPivot setup.
• Capacity planning – need memory for any in-use database plus a 10-20% buffer for auto detection(?).
• #PowerPivot keeps a cache of detached databases in the SSAS backup folder.
• SSAS servers for #PowerPivot will be 64-128 GB and more. It will work with 8 GB, but most likely you will want more.
• #PowerPivot capacity planning depends on: total/concurrent # of users, # of files, max file size, data refresh options.
• In the SharePoint content DB, workbooks are stored as BLOBs. Could use Remote Blob Storage with FILESTREAM. The overhead is 2-3%, but it handles higher-concurrency scenarios better.
• There are different topology options – one SharePoint server, multiple servers with different services on them, etc…
• Currently SharePoint requires Kerberos security all the way.
• In SharePoint 2010 there are new “claims tokens”, so Kerberos is not needed for #PowerPivot.
• Kerberos might still be required, but there are fewer cases where you need it for #PowerPivot with SharePoint 2010.
• #PowerPivot - before installing, YOU MUST READ THE MANUAL! That is important.
• Upgrading SharePoint 2007 to 2010 is very complex for #PowerPivot. You cannot upgrade CTP2 -> CTP3. Not clear about CTP3 -> RTM.
• SharePoint is optimized for download, so there are concerns about uploading large #PowerPivot files.
• #PowerPivot max size is 2 GB (a SharePoint limit). You can create #PowerPivot files larger than 2 GB, you just cannot upload them to SharePoint. Additional info: the actual max publish size could be 1.8 GB. Although it is a limitation, it will affect very few users, as the majority will not have such large files, so there is no need to worry about it.
• In SharePoint the LargeChunkFileSize parameter is useless – don’t touch it for #PowerPivot.
• To upload #PowerPivot to SharePoint you need a very good network connection! Maybe copy it to the SharePoint server first – faster.
• Troubleshooting – error messages are very cryptic for #PowerPivot.
• Use the ULS logs from SharePoint (Bing or Google for more info) to troubleshoot #PowerPivot.
• From the ULS logs, use the CorrelationID to track down the log data for one event. Log files are very large; filter by time too. #PowerPivot
• There is no separate #PowerPivot log. The error could be in #PowerPivot, Excel Services, etc., so ULS is one option.
• You can use SQL Profiler to troubleshoot the #PowerPivot SSAS instance! Good news!
• So to troubleshoot #PowerPivot you need to know SharePoint and SSAS and use the tools! Consultants will still be busy…
• Session almost done, time for lunch. See you later tonight for the #PowerPivot DAX session!
• Create a view with fewer records and build the #PowerPivot workbook from it, deploy it to SharePoint, then update the view to include all records. Faster. (See the sketch after this session's notes.)
• #PowerPivot tip: rename the *.xlsx file to *.zip, and inside you will see a data file that you can rename to *.abf and restore to an SSAS server that runs in PowerPivot integrated mode.
• This was the best session so far – Best Practices on #PowerPivot by @dennylee and Dave Wickert.
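
As a rough illustration of the “small view first” tip above, here is a minimal T-SQL sketch, assuming a hypothetical dbo.FactSales table and dbo.vSalesForPowerPivot view (all object names are made up for illustration):

-- Step 1: build and publish the PowerPivot workbook against a small slice of data.
CREATE VIEW dbo.vSalesForPowerPivot
AS
SELECT SalesOrderID, OrderDate, ProductID, SalesAmount
FROM dbo.FactSales
WHERE OrderDate >= '20100101';   -- small subset keeps the initial upload small
GO

-- Step 2: after the workbook is deployed to SharePoint, widen the view to all rows;
-- the next data refresh then brings in the complete data set.
ALTER VIEW dbo.vSalesForPowerPivot
AS
SELECT SalesOrderID, OrderDate, ProductID, SalesAmount
FROM dbo.FactSales;
GO

After publishing, refresh the workbook's data so it picks up the full result of the widened view instead of re-uploading a huge file.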
- from the session “DAX in #PowerPivot for Excel 2010” by Howie Dickerman.
• Data Analysis Expressions = DAX.
• DAX lets the user do multidimensional analysis without knowing that it is multidimensional analysis.
• Sample DAX: =[Qty]*[Price] – syntax just like Excel.
• DAX uses Excel functions, but there is no notion of addressing individual cells or ranges – instead you reference columns in the data.
• DAX is not a replacement for MDX.
• DAX provides functions that implement relational database concepts: filter tables, aggregate, follow relationships.
• DAX one-to-many function sample: =SUMX(RELATEDTABLE(Sales), Sales[Amount])
• DAX has functions to assist with dynamic aggregations of measures:
• DAX dynamic sample: =VALUES(Time[Year]) & "." & VALUES(Product[ProductID])
• DAX sample: =IF(VALUES(Time[Year])=2008, "Baseline", "normal")
• DAX calculation expressions live in two places – calculated columns (fully materialized) and measures (evaluated dynamically for each cell).
• More than 80 Excel functions are available in DAX.
• CTP3 has a new FORMAT function that converts any number to a string. #DAX
• DAX sample: [SalesAmt]/[SalesAmt](All(Product)) – using a measure as a function.
• DAX: =Sales[SalesAmt]/CALCULATE(Sales[SalesAmt], ALL(Product)) – the more explicit syntax.
• CTP3 DAX has 35 time intelligence functions – they require a date column in the data, of column type Date.
• This version of DAX does not have custom time periods or weeks. It works just with year, quarter, month, day.
• DAX function samples: FirstDate, LastDate, FirstNonBlank, StartOfMonth, StartOfQuarter, EndOfYear.
• DAX: DateAdd, DatesBetween, DatesInPeriod – 26 functions that return a table of dates.
• DAX: PreviousDay, NextMonth, DatesMTD, DatesYTD, TotalMTD, OpeningBalanceQuarter, ClosingBalanceYear, etc. <- more functions
• Year-over-year growth: =Sales[SalesAmt] - Sales[SalesAmt](DateAdd(Time[Date], -1, Year), All(Time))
• The All(Time) part needs to be added explicitly in CTP3; in RTM it will be handled under the covers. It is there so that the first year has data.
• DAX sample: QTD Sales =TotalQTD(Sales[SalesAmt], Time[Date], All(Time))
• DAX year ago: =Sales[SalesAmt](ParallelPeriod(Time[Date], -12, Month), All(Time))
• I missed the single quotes in some of the previous functions. If a table name contains spaces, you need to put single quotes around the name.
• DAX autocomplete in CTP3 adds the single quotes, but they are optional.
More Details:
http://www.ssas-info.com/VidasMatelisBlog/195_my-tweets-from-sql-pass-summit-2009-ssas-powerpivot-dax-and-more
http://ms-olap.blogspot.com/2010/05/self-service-bi-powerpivot-and-future.html#comment-form
http://cwebbbi.wordpress.com/2010/11/11/pass-summit-day-2/
~Rav’s

Wednesday, November 3, 2010

My Latest Pics – Smoky Mountains, Tennessee











SSRS Report Deployment Procedure in SharePoint Server

Hi Guys,
Once we develop an SSRS report, we have to deploy it to the SharePoint portal so that power users and CEOs can access it.
Here is the detailed deployment procedure; please have a look.
Step-1: Deploy SSRS Reports to the Report Server
1. Open the SSRS Report Manager Production URL
https://servername.com/reports

2. Go to the Home Folder in Report Manager.

3. Go to Home/DataSources folder

4. Click “New Data Source”. Make sure the settings on the “Properties/General” tab match the expected values: Name, Connection Type, Connection String, and report server credentials.

5. At the very bottom of the page, click on “Apply”

6. Go to the Home/Administration folder and, for each report:
o Click the “Upload File” button
o Navigate to the reports location folder
o Select the report RDL file from the table below
o Click “OK”

7. Click “Properties” > “Data Sources”, then click the “Browse” button.

8. Expand the “Data Sources” folder and then select the Data Source “XXXX”

9. Click “OK” and then “Apply”

All the required reports should now have been successfully migrated to the server (Test/Prod).

Step-2: Changes at the Report Viewer Level
In the Report Viewer folder, edit the XXXMenu.xml file.
Under the Administration Reports section, add the new tag.





Step-3: Changes at the SharePoint Portal
Copy the report’s link from the report server and add it to the SharePoint portal’s Source Editor HTML.
1. Go to the portal page where you want to show the report, select the drop-down button in that section, choose the ‘Modify Shared Web Part’ option, and click the Source Editor button in the Content Editor window.
2. Copy the report URL from the Report Server and paste it into the portal’s Content Editor window. Place the code below inside it:

<a href="https://servername.com/ReportServer/Default.aspx?%2fReports%2fAdmin+Sales+Report&rs:Command=Render" target="_blank">Sales Report</a>


That’s it. Now the user can access the report from the SharePoint portal.

We have different methods to deploy reports.
See the blogs below for some other deployment techniques.

http://blah.winsmarts.com/2007-12-SharePoint_and_SQL_Server_Reporting_Services_-_Authoring_a_simple_report.aspx

http://bisqlserver.rdacorp.com/2008/08/deploy-ssrs-reports-in-sharepoint.html


~Rav’s


Monday, November 1, 2010

SSAS-Data Security

Hi,
SSAS provides a way to secure Analysis Services database/cube data from unauthorized access. Analysis Services secures access through objects called "roles". After a role is created, a user's Windows login credentials can be used to enroll them in a particular role, because Analysis Services identifies users by their Windows login credentials. You can protect your data in roles at two levels:
1) Dimension level
2) Cell level
If a user has been assigned more than one role, Analysis Services loops through all of the assigned roles after login. It finds all the permission levels for that user and takes the union of them.
If two roles give contradictory access for a user, the access in question is allowed. Suppose role1 allows access to Australia data and role2 denies access to Australia data; then access to Australia data will be allowed.

http://learnmicrosoftbi.blogspot.com/2010/10/ssas-data-security-dimension-and-cell.html


SSAS-Interview Questions

Nice post!! Thanks, Amit!!
http://learnmicrosoftbi.blogspot.com/search/label/Interview%20Questions

SSAS-MDX functions

Hi Guys,
Please look at the site below for more MDX functions.
Regards,
Rav’s
http://www.ssas-info.com/analysis-services-articles/50-mdx/2518-mdx-tutorials-mdx-queries-mdx-samples

SQL SERVER – Merge Operations – Insert, Update, Delete in Single Execution

Hi, MERGE is a new feature that provides an efficient way to perform multiple DML operations. In earlier versions of SQL Server we had to write separate statements to INSERT, UPDATE, or DELETE data based on certain conditions; with the MERGE statement we can include all of that logic in one statement, which updates the data when it is matched and inserts it when it is unmatched.
One of the most important advantages of the MERGE statement is that the data is read and processed only once. In earlier versions, three different statements had to be written for the three activities (INSERT, UPDATE, or DELETE); with MERGE, all of them can be done in one pass over the table, as in the sketch below.
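
Here is a minimal T-SQL sketch of the pattern, assuming hypothetical dbo.TargetProduct and dbo.SourceProduct tables (the table and column names are made up for illustration):

MERGE dbo.TargetProduct AS T
USING dbo.SourceProduct AS S
    ON T.ProductID = S.ProductID
WHEN MATCHED THEN                   -- row exists in both tables: update it
    UPDATE SET T.ProductName = S.ProductName,
               T.ListPrice   = S.ListPrice
WHEN NOT MATCHED BY TARGET THEN     -- row only in the source: insert it
    INSERT (ProductID, ProductName, ListPrice)
    VALUES (S.ProductID, S.ProductName, S.ListPrice)
WHEN NOT MATCHED BY SOURCE THEN     -- row only in the target: delete it
    DELETE;

All three actions are resolved in a single pass over the target table.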
See the blog below for more info. Nice article from Dave.
http://blog.sqlauthority.com/2010/06/08/sql-server-merge-operations-insert-update-delete-in-single-execution/
