MSBI (SSIS/SSRS/SSAS) Online Training

Thursday, October 17, 2013

SSIS - Slowly Changing Dimension Type 1 Implementation with Merge Join vs Lookup vs SCD Transformation

Hi All,
I would like to compare the SCD implementation design patterns and their advantages/disadvantages in terms of performance. Enjoy!

I. Slowly Changing Dimension Type 1 with Merge Join
Right from the start the design is a bit different. Instead of adding your source query to an OLEDB Source component and then connecting it to the lookup or the SCD component, we create 2 OLEDB Source components. 
The first one is the same as in the other two design patterns: it connects to our source table (DimUserUpdate). However, we have to modify the query a little bit to get it to work with the Merge Join component. So let's take a look at that query.
SELECT
    [UserAK]
    ,[ScreenName]
    ,[AccountCreateDate]
    ,[TimeZone]
    ,[Language]
    ,[GeographySK]
    ,CHECKSUM( 
        [ScreenName]
        ,[AccountCreateDate]
        ,[TimeZone]
        ,[Language]
        ,[GeographySK] 
    ) as Hash_CD
FROM
    DimUserUpdate
ORDER BY UserAK

The addition of the ORDER BY clause is essential when using the Merge Join design pattern.
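The job CHECKSUM does here can be sketched outside SQL as well. The hypothetical helper below (Python, for illustration only; the package computes CHECKSUM in the query itself) hashes the tracked columns of a row so two versions of the same user can be compared with a single value:

```python
import hashlib

def row_hash(row, tracked_columns):
    # Concatenate the tracked column values and hash them, analogous
    # to CHECKSUM(ScreenName, AccountCreateDate, ...) in the source query.
    joined = "|".join(str(row[c]) for c in tracked_columns)
    return hashlib.md5(joined.encode("utf-8")).hexdigest()

tracked = ["ScreenName", "AccountCreateDate", "TimeZone", "Language", "GeographySK"]
old = {"ScreenName": "bob", "AccountCreateDate": "2013-01-01",
       "TimeZone": "UTC", "Language": "en", "GeographySK": 7}
new = dict(old, TimeZone="PST")

print(row_hash(old, tracked) == row_hash(old, tracked))  # True: nothing changed
print(row_hash(old, tracked) == row_hash(new, tracked))  # False: TimeZone changed
```

Note that, like T-SQL CHECKSUM, any short hash value can collide, which is why some designs prefer HASHBYTES for stricter change detection.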

The second OLEDB Source component will query against our destination table, essentially replacing the lookup  component from the Lookup Conditional Split design pattern.  As in the lookup query we only need to bring back the UserAK, UserSK and the Hash_CD.  But just like the source query above we need to add the ORDER BY clause because we are going to use the Merge Join.

SELECT [UserSK]
      ,[UserAK]
      ,[Hash_CD]
  FROM [Demo].[dbo].[DimUser]
ORDER BY UserAK
Simply using the ORDER BY clause is not enough for SSIS to know that the datasets you intend to feed into the Merge Join are sorted. You have to tell SSIS that the data is sorted, and on what column the sort is happening.  I agree it’s a little redundant, but that’s how it is.  You must do this for each OLEDB Source.  To do this, right-click on the OLEDB Source and select Show Advanced Editor from the menu.
Once inside the advanced editor select the Input and Output Properties Tab along the top of the window. Then under the Common Properties for the OLEDB Source Output change the IsSorted property to True.
That takes care of telling SSIS that the data is sorted, but now we need to tell SSIS what column it is sorted on.  To do that, drill down from the OLEDB Source Output to the Output Columns, select the column UserAK (or columns, if your query is ordered by more than one column), and change its SortKeyPosition to 1.
Now do this for the second OLEDB Source and then we’ll be ready for the Merge Join component.  Please note that if you have multiple columns in your ORDER BY clause they must match in both queries, and you must set the SortKeyPosition in the same order for both components.  Once the source components are configured, drag in the Merge Join component.
Now drag the first source component output connection to the Merge Join component. Once you’ve connected this a selection box will appear.  It will ask you to tell it what side of the join this output belongs on.  This is important, because we are basically going to be creating a Left Outer Join inside the component, so knowing which side of the join each portion belongs to is obviously essential.
When you add the second output from the other OLEDB Source you won’t be prompted again, it will just make it the input for the other side of the join. Now double click on the Merge Join component and open the editor. 
The first property to set is the Join Type.  Your options are Left Outer Join, Inner Join, or Full Outer Join.  We are going to be using the Left Outer Join option.  Below the Join Type are the columns from our two queries, each on the side we specified when hooking up the component to the Merge Join.  The keys are already linked; now we need to add the columns we want to return from both sides of the join.  Make sure to select both of the Hash_CD columns to make comparing them easier later on in the package.
The next step will be to add a Conditional Split to the package to determine which rows are new rows to be inserted and which rows need to be evaluated for updates.  This requires only one output, which we’ll call NewRecords.  Here is the expression you need to add in the editor:

ISNULL(UserSK)

If this doesn’t make immediate sense, let me explain.  Since we did a left outer join, any record without a match between the two queries on UserAK is a new record, and as a result its UserSK evaluates to NULL.  Now hook up the NewRecords output to your OLEDB Destination, set the connection to the destination table, confirm the mappings, and the new records will get inserted correctly.
Now on to the second Conditional Split. Here we need to evaluate whether the records that matched on UserAK have had a change in any of the columns we are tracking changes on.  To do this we will use the checksum values created in our source and destination queries.  We need to write 2 SSIS expressions, 1 for each output, to determine whether the records have changed.  Since both sides of the join produce a column named Hash_CD, alias the destination side (for example, DST_Hash_CD) so the two can be told apart.  Here are the expressions:

Update output: Hash_CD != DST_Hash_CD
NoChange output: Hash_CD == DST_Hash_CD
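The whole pattern can be condensed into a small Python sketch (illustrative only; the function name classify_rows is hypothetical, and the real work is done by the Merge Join and the two Conditional Splits). Both inputs arrive sorted on UserAK, the join is a left outer join, and rows are routed first by NULL surrogate key and then by checksum:

```python
def classify_rows(source_rows, dest_rows):
    # Left-outer join source to destination on UserAK (both inputs sorted),
    # then split: new rows (no match), changed rows (hash differs), unchanged.
    dest_iter = iter(dest_rows)
    dest = next(dest_iter, None)
    new, changed, unchanged = [], [], []
    for src in source_rows:
        # Advance the destination stream -- this only works because
        # both inputs are sorted on UserAK, hence the ORDER BY clauses.
        while dest is not None and dest["UserAK"] < src["UserAK"]:
            dest = next(dest_iter, None)
        if dest is None or dest["UserAK"] != src["UserAK"]:
            new.append(src)                            # ISNULL(UserSK) in the split
        elif dest["Hash_CD"] != src["Hash_CD"]:
            changed.append({**src, "UserSK": dest["UserSK"]})  # route to update
        else:
            unchanged.append(src)                      # no change, ignore
    return new, changed, unchanged

dim = [{"UserAK": 1, "UserSK": 101, "Hash_CD": "a"},
       {"UserAK": 2, "UserSK": 102, "Hash_CD": "b"}]
src = [{"UserAK": 1, "Hash_CD": "a"},   # unchanged
       {"UserAK": 2, "Hash_CD": "x"},   # changed -> update
       {"UserAK": 3, "Hash_CD": "c"}]   # new -> insert
new, changed, unchanged = classify_rows(src, dim)
```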
We then connect the Update output to the OLEDB Command component to update the rows that need to be updated.
Inside the OLEDB Command editor the first tab you are shown is the connection manager tab. Simply connect to the destination database and then select the component properties tab.
At the bottom of this tab, under Custom Properties is the SQLCommand property. Here you will write your update statement. To do this you will have to map all of the columns to be updated in the destination to the columns coming through in the data flow. The query will look like you are mapping them to parameter values (?). Notice that even the where clause is set using the parameter indicator.
UPDATE DimUser
 SET
     [ScreenName] = ?
    ,[AccountCreateDate] = ?
    ,[TimeZone] = ?
    ,[Language] = ?
    ,[GeographySK] = ?
 WHERE [UserSK] = ?
Once the query is complete go to the Column Mapping tab.
Now it is just a matter of correctly mapping the parameter values to the columns in your data flow (input columns). Make sure to pay attention to the column order in the update query so you map each parameter value to the appropriate input column. Remember that the last parameter value is for the WHERE clause, which is why we brought the UserSK value through from the destination query to begin with.
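Positional parameter mapping is easy to get wrong, so here is a minimal sketch using Python's sqlite3 module (an assumption for illustration only; the package uses the OLEDB Command against SQL Server). The values must line up with the ? markers in order, with the WHERE-clause key last:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE DimUser (UserSK INTEGER PRIMARY KEY,"
             " ScreenName TEXT, TimeZone TEXT)")
conn.execute("INSERT INTO DimUser VALUES (1, 'old_name', 'UTC')")

# As in the OLEDB Command, the parameters are purely positional: the order
# of the supplied values must match the order of the ? markers, and the
# WHERE-clause key (UserSK) comes last.
conn.execute("UPDATE DimUser SET ScreenName = ?, TimeZone = ? WHERE UserSK = ?",
             ("new_name", "PST", 1))

print(conn.execute("SELECT ScreenName, TimeZone FROM DimUser"
                   " WHERE UserSK = 1").fetchone())
# ('new_name', 'PST')
```

Swapping any two values in the tuple silently updates the wrong columns, which is why the column-mapping step deserves care.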
When you’re done, the data flow should look something like this.
I hope this helps some of you looking to try different ways to update SCD1 Dimensions.

II. Slowly Changing Dimension Type 1 with Lookup and Conditional Split
In the post below I talked about using the SCD component that comes with SSIS to load a dimension.  That method is OK for loading small tables (fewer than about 50,000 rows).  The Lookup and Conditional Split design pattern performs much better.  The main reason for the enhanced performance is the Lookup component: it executes its query once and stores the results in cache, whereas the SCD component queries the destination table for each row that comes from the source.  While there is no easy-to-configure wizard for setting up this design pattern, it isn’t too difficult to do manually.
With the SCD component we didn’t have to do any additional work in our source query to make comparing the columns quick and easy.  With the Lookup and Conditional Split we need to add a checksum to our query.  Here is the query that includes the checksum:
SELECT
    [UserAK]
    ,[ScreenName]
    ,[AccountCreateDate]
    ,[TimeZone]
    ,[Language]
    ,[GeographySK]
    ,CHECKSUM( 
        [ScreenName]
        ,[AccountCreateDate]
        ,[TimeZone]
        ,[Language]
        ,[GeographySK] 
    ) as Hash_CD
FROM
    DimUserUpdate
Now that we have our source query configured correctly, added the Lookup component, and connected the two, let’s configure the Lookup.
On the General tab you can configure the cache mode, the connection type, and the way to handle no matches.  This third setting is very important when configuring the Lookup for Slowly Changing Dimensions.  You want to set the option to Redirect rows to no match output.  This allows you to insert rows that don’t already exist in your destination table.
On the Connection tab you obviously set the connection to your database and can either choose to select a table or use a SQL query.  I suggest you always use a SQL query and bring back only the columns you need for the lookup.  Doing this will save space in memory; if you have a very wide table with millions of rows, bringing it all back could cause your lookup to run slowly or even fail.  In this query I am only bringing back the UserAK (business key), the UserSK (surrogate key), and the checksum value.  With regard to the checksum, it is entirely up to you (or the data architect, if that isn’t you as well) whether or not you store the checksum value.  In this example I am not storing the value.  Here is the query:
SELECT
    UserSK
    ,UserAK
    ,CHECKSUM( 
        [ScreenName]
        ,[AccountCreateDate]
        ,[TimeZone]
        ,[Language]
        ,[GeographySK] 
    ) as Hash_CD
FROM dbo.DimUser 
The Columns tab is pretty easy to configure.  The two boxes on the upper half of the editor window represent the source query (left) and the lookup query (right).  To configure, just drag the business key (UserAK) from the source query onto the business key (UserAK) from the lookup query.  Then put a check mark next to the columns from the Lookup that you want to bring back into the data flow (UserSK and Hash_CD).  Since the column Hash_CD exists in both the lookup and the source queries, make sure to set the Output Alias (LKP_Hash_CD) so it is easy to differentiate between the two.  This is all you need to do to configure the Lookup.
From the Lookup we should have two outputs:
  1. Lookup Match Output
  2. Lookup No Match Output
The Lookup No Match Output will contain all of our new records.  We can map this output directly to our OLEDB Destination with no further work. 
The Lookup Match Output contains all the records that had matches in the destination table, so the next step will be to determine whether the incoming records are different from the records that already exist.  To do this we will use the Conditional Split transform.
To configure this transform we will use the two Hash_CD values to create two different outputs from the Conditional Split.  First we will configure the NoChange output: name the output and then add the SSIS expression to compare the two values.  In this instance we want to send all the matching Hash_CD values to this output.  Here is the expression:
LKP_Hash_CD == Hash_CD
The next output will be the Change Output, and will contain all the records where the Hash_CD values didn’t match. Here is that expression:
LKP_Hash_CD != Hash_CD
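As a sketch of what this buys over the SCD component: the Lookup (in full cache mode) builds its result set once and probes it per source row, instead of issuing a query per row. The hypothetical helper below (Python, illustration only) mirrors the three outputs:

```python
def split_rows(source_rows, lookup_cache):
    # Mimic Lookup (full cache) + Conditional Split.
    # lookup_cache maps UserAK -> (UserSK, LKP_Hash_CD); it is built once
    # from the lookup query against DimUser, instead of per-row queries.
    no_match, changed, no_change = [], [], []
    for row in source_rows:
        hit = lookup_cache.get(row["UserAK"])
        if hit is None:
            no_match.append(row)                       # Lookup No Match Output -> insert
        elif hit[1] != row["Hash_CD"]:
            changed.append({**row, "UserSK": hit[0]})  # Change Output -> update
        else:
            no_change.append(row)                      # NoChange Output
    return no_match, changed, no_change

cache = {1: (101, "a"), 2: (102, "b")}   # built once from the lookup query
inserts, updates, ignore = split_rows(
    [{"UserAK": 1, "Hash_CD": "a"},      # unchanged
     {"UserAK": 2, "Hash_CD": "x"},      # changed -> update
     {"UserAK": 3, "Hash_CD": "c"}],     # new -> insert
    cache)
```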
That is it for the Conditional Split configuration. Now we need to set up the package to perform the updates.  There are a couple of methods to do this, but we are going to use the OLEDB Command to do the updates.

As you can see we use the Change Output to connect to our OLEDB Command component. 

Inside the OLEDB Command editor the first tab you are shown is the connection manager tab. Simply connect to the destination database and then select the component properties tab.
At the bottom of this tab, under Custom Properties is the SQLCommand property.  Here you will write your update statement.  To do this you will have to map all of the columns to be updated in the destination to the columns coming through in the data flow.  The query will look like you are mapping them to parameter values (?).  Notice that even the where clause is set using the parameter indicator.
UPDATE DimUser
 SET
     [ScreenName] = ?
    ,[AccountCreateDate] = ?
    ,[TimeZone] = ?
    ,[Language] = ?
    ,[GeographySK] = ?
 WHERE [UserSK] =?
Once the query is complete go to the Column Mapping tab.
Now it is just a matter of correctly mapping the parameter values to the columns in your data flow (input columns).  Make sure to pay attention to the column order in the update query so you map each parameter value to the appropriate input column. Remember that the last parameter value is for the WHERE clause, which is why we brought the UserSK value back from the lookup query to begin with.
Once all the mapping is done click OK and you are now ready to handle Type 1 changes in your Slowly Changing Dimension.

III. Slowly Changing Dimension Type 1 Changes using SSIS SCD Component
The Slowly Changing Dimension component included with SSIS is one of the methods you can use to manage slowly changing dimensions, and it’s pretty easy to use.  The problem with the SCD component is that it performs poorly, especially as the number of rows in your table grows.  I would say that any dimension with over 50,000 records would be too big for this component.  For small dimensions it will work just fine.
Once you have configured your source component, placed the SCD component in your design pane, and hooked the two components up, double-click on the SCD component to open the editor.
The first step is to connect to your destination table.  Then you need to select the column or columns in the destination table that match the key or keys from your source table.  In this instance we are choosing the UserAK column, which is the primary key in our source table and the alternate key in the destination table.  Once the key columns are selected, click Next.
On this screen you tell the wizard which of your non-key columns you are going to update.  You do this by selecting one of the options from the dropdown menu under the Change Type column next to each dimension column.  There are three options:
  1. Fixed Attributes, which means that the data in these columns won’t ever change, even if a change comes through from the source. 
  2. Changing Attributes which corresponds to a Type 1 change.
  3. Historical Attributes which corresponds to a Type 2 change.
Since we are only worried about Type 1 changes, we are going to select the Changing Attributes option.  Once all the columns are configured as you would like them, click Next.
Here we will configure how to handle the fixed and changing attributes. The first option determines how we want to handle updates that come through for fixed attributes: either fail the transform or not. The next option allows you to update changing attributes in historical records as well as the current record. Once done here, click Next.
On the next screen you configure how to handle inferred members; for our purposes we will just leave this option disabled. Click Next and then Finish.  The wizard will add both an OLEDB Destination component and an OLEDB Command component to handle the inserts and the updates, and configure them for you.
Please don’t let the relative ease of setting up the SCD component drive your decision to use it.  If you expect your dimension to get fairly large there are other design patterns that you can use that will produce much better results.


Wednesday, September 18, 2013

SSRS Interview Questions and Answers



1. How do u implement Cascading parameter?
The list of values for one parameter depends on the value chosen in the preceding parameter.
Eg: Country -->  State --> City
2. How to pass parameter from Report Viewer Control to sub report?

3. How to open another report in a new window from existing report?
Use a little javascript with a customized URL in the "Jump to URL" option of the Navigation tab.
Non-parameterized Solution
To get started, let's pop up a simple non parameterized report. Follow these instructions:
1.                 Instead of using the "Jump to Report" option on the Navigation tab, use the "Jump to URL" option.
2.                 Open the expression screen (Fx button).
3.                 Enter the following:
=
"javascript:void(window.open('http://servername/reportserver?%2fpathto%2freport&rs:Command=Render'))"
4.                 Click OK twice, then save and deploy the report.
Parameterized Solution
Assume you have a field called ProductCode. Normally, you might hard code that like this:
http://servername/reportserver?%2fpathto%2freport&rs:Command=Render&ProductCode=123

In this case, you want to pass variables dynamically, using an available value from the source dataset. You can think of it like this:

http://servername/reportserver?%2fpathto%2freport&rs:Command=Render&ProductCode=Fields!ProductCode.Value

The exact syntax in the "Jump to URL" (Fx) expression window will be:

=
"javascript:void(window.open('http://servername/reportserver?%2fpathto%2freport&rs:Command=Render&ProductCode="+Fields!ProductCode.Value+"'))"

4. How to pass parameter from chart to Table in same report?

5. How to apply custom Colors of chart report?
STEP1:
Create your custom color palette in the report using custom code. To do so, click Report => Report Properties => Code and copy the code below:

Private colorPalette As String() = { "Blue", "Red", "Teal", "Gold", "Green","#A59D93", "#B8341B", "#352F26", "#F1E7D6", "#E16C56", "#CFBA9B"}
Private count As Integer = 0
Private mapping As New System.Collections.Hashtable()

Public Function GetColor(ByVal groupingValue As String) As String
If mapping.ContainsKey(groupingValue) Then
Return mapping(groupingValue)
End If
Dim c As String = colorPalette(count Mod colorPalette.Length)
count = count + 1
mapping.Add(groupingValue, c)
Return c
End Function

STEP2:
In the Pie Chart, select Series Properties and select the Fill option from left side.
Now write following expression in the Color expression:
=code.GetColor(Fields!Year.Value)

Here Fields!Year.Value is a field of dataset which is used as Chart Category fields.
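The custom-code logic above can be mirrored in Python to see what it does: each new grouping value takes the next palette color (cycling with modulo), and the assignment is memoized so repeated values always get the same color:

```python
color_palette = ["Blue", "Red", "Teal", "Gold", "Green"]
mapping = {}

def get_color(grouping_value):
    # Python analogue of the report's GetColor custom code: assign the next
    # palette color to each distinct grouping value, cycling with modulo,
    # and remember the assignment so repeats stay consistent.
    if grouping_value not in mapping:
        mapping[grouping_value] = color_palette[len(mapping) % len(color_palette)]
    return mapping[grouping_value]

print(get_color("2011"))  # Blue
print(get_color("2012"))  # Red
print(get_color("2011"))  # Blue again -- same value, same color
```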

6. Can we have Table within a Table in SSRS report?
Yes. We can have Nested Tables.
7. How to apply stylesheet to SSRS Reports
Select Report -> Report Properties from the menu and then click the Code tab.

Function StyleElement (ByVal Element As String) As String
        Select Case UCase(Element)
            Case "TABLE_HEADER_BG"
                Return "Red"
            Case "TABLE_FOOTER_BG"
                Return "Green"
            Case "TABLE_HEADER_TEXT"
                Return "White"
            Case Else
                Return "Black"
        End Select
End Function

Now apply this function to the style property of an element on the report.
=code.StyleElement("TABLE_HEADER_TEXT")  

If you want to apply dynamic styles to the report, create tables in SQL Server and insert the style information into those tables.
Create a dataset and specify the stored procedure that returns the style values.
Example: =Fields!TABLE_HEADER_TEXT.Value
where TABLE_HEADER_TEXT is a column in the table.

8. Dynamic sorting, Dynamic Grouping in SSRS
Can be done using expressions.
9. Different types of Filters
The 2 types of filters in SSRS are:
Dataset Filter:  Filtering within the source query.  When you implement a filter within the data set, less data is sent  from the source database server to the Report Server - usually a good thing.
Report Filter:  This includes filtering after the source query has come back – on a data region (like the Tablix), or a data grouping.  When you implement a filter within the report, when the report is re-executed again with different parameter choices, the Report Server uses cached data rather than returning to the database server.  
Using a Dataset Filter is the most efficient method.
10. Difference between Filter and Parameter? Which one is better?
In the case of Filters, the data is first fetched from the database, and then the filters are applied to the fetched data. Filters are applied at run time, first on the dataset, then on the data region, and then on the group, in top-down order for group hierarchies.
To add a filter, we must specify a filter equation (expression). The data types of the filtered data and the value must match.

Parameters are applied at the database level. The Data will be fetched based on parameters at the database level using WHERE condition in the query.

Parameters are better than Filters in performance.

11. Optimization of Report
Report can be optimized in terms of Grouping, Filters.
Report can be optimized through Caching, Snapshot and subscriptions.


1. The total time to generate a report (RDL) can be divided into 3 elements:
Time to retrieve the data (TimeDataRetrieval).
Time to process the report (TimeProcessing)
Time to render the report (TimeRendering)
Total time = (TimeDataRetrieval) + (TimeProcessing) + (TimeRendering)

These 3 performance components are logged every time a deployed report is executed. This information can be found in the ExecutionLogStorage table in the ReportServer database.

SELECT TOP 10 Itempath, parameters,
     TimeDataRetrieval + TimeProcessing + TimeRendering as [total time],
     TimeDataRetrieval, TimeProcessing, TimeRendering,
     ByteCount, [RowCount],Source, AdditionalInfo
FROM ExecutionLogStorage
ORDER BY Timestart DESC

2. Use SQL Profiler to see which queries are executed when the report is generated. Sometimes you will see more queries being executed than you expected. Every dataset in the report will be executed. New datasets are often added while building reports (for instance, datasets for available parameter values); check whether all datasets are still being used and remove any that are no longer needed.
3. Sometimes a dataset contains more columns than used in the Tablix\list. Use only required columns in the Dataset.

4. The ORDER BY in the dataset may differ from the ORDER BY in the Tablix\list. You need to decide where the data will be sorted: it can be done within SQL Server with an ORDER BY clause, or by the Reporting Services engine. It is not useful to do it in both. If an index is available, use the ORDER BY in your dataset.
5. Use the SQL Profiler to measure the performance of all datasets (Reads, CPU and Duration). Use the SQL Server Management Studio (SSMS) to analyze the execution plan of every dataset.

6. Avoid datasets with large result sets (more than about 1,000 records).  Often data is grouped in the report without a drill-down option; in that scenario, do the GROUP BY in your dataset instead. This will save a lot of data transfer from SQL Server and spare the reporting engine from grouping the result set.
7. Rendering of the report can take a while if the result set is very big. Look critically at whether such a big result set is necessary. If details are used in only 5% of cases, create another report to display the details; this avoids retrieving all the details in the other 95% of cases.
12. I have 'State' column in report, display the States in bold, whose State name starts with letter 'A' (eg: Andhra pradesh, Assam should be in bold)

13. In which scenario you used Matrix Report
Use a matrix to display aggregated data summaries, grouped in rows and columns, similar to a PivotTable or crosstab. The number of rows and columns for groups is determined by the number of unique values for each row and column group.
14. Image control in SSRS
An image is a report item that contains a reference to an image that is stored on the report server, embedded within the report, or stored in a database.
Image Source : Embedded
Local report images are embedded in the report and then referenced. When you embed an image, Report Designer MIME-encodes the image and stores it as text in the report definition.
When to Use:
When image is embedded locally within the report.
When you are required to store all images within the report definition.
https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEinQcXb68OU6TuTQ1UjRmt33IaEB_DeXUFivOcs8iR6wkRI3VhXNoqsK1MtYL0Taa8hWiwXRojvK_U_5rVlo_kbqM3ZyOaPil4eioqT_ogH223UFJI8nU35AJimRD3Oco9bc72q0wu0Ekc/s640/image_control.jpg

Image Source : External

When you use an external image in a report, the image source is set to External and the value for the image is the URL to the image.
When to Use:
When images are stored in a File System, External File Share or Web Site.

Image Source : Database
If we add images that are stored in a database to a report, such an image is known as a data-bound image. Data-bound images can also be displayed from binary data (BLOB) stored in a database.
When to use:
When image is stored in a Database.
When you specify a dataset field that is bound to a database field that contains an image.

15. Role of Report Manager
Deploying the reports onto the web server.
Delivering the reports through E-mail or File Share using the subscriptions.
Creating the Cached and Snapshot Reports.
Providing the Security to the reports.
16. How to upload a report to report server
In the Report Manager, we have upload option to upload the reports.
17. What is a Shared Dataset
Shared datasets retrieve data from shared data sources that connect to external data sources. A shared dataset contains a query to provide a consistent set of data for multiple reports. The dataset query can include dataset parameters.

Shared datasets use only shared data sources, not embedded data sources.

To create a shared dataset, you must use an application that creates a shared dataset definition file (.rsd). You can use one of the following applications to create a shared dataset:
1. Report Builder: Use shared dataset design mode and save the shared dataset to a report server or SharePoint site.
2. Report Designer in BIDS: Create shared datasets under the Shared Dataset folder in Solution Explorer. To publish a shared dataset, deploy it to a report server or SharePoint site.

Upload a shared dataset definition (.rsd) file. You can upload a file to the report server or SharePoint site. On a SharePoint site, an uploaded file is not validated against the schema until the shared dataset is cached or used in a report.

The shared dataset definition includes a query, dataset parameters including default values, data options such as case sensitivity, and dataset filters.

18. How do u display partial text in bold format in a textbox in a report? (e.g.: FirstName LastName, where "FirstName" should be in bold font and "LastName" should be in normal font.)
Use a Placeholder.
19. How to Keep Headers Visible When Scrolling Through a Report?
1. Right-click the row, column, or corner handle of a tablix data region, and then click Tablix Properties.
2. On the General tab, under Row Headers or Column Headers, select Header should remain visible while scrolling.
https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgCgmHpskG31J1OfAC4rjRE92yf_lS9FBmS2zuUq852EgA-9D_iZvVAR0mvlNKeCZYvIxKH7uZcMtCJbcBjqFUAtENkr_6ojJAd3vebX5gkdRFW_JlCF_vGHPj83EwCnPt9_Yk45QvHj94/s400/1.jpg

3. Click OK.


To keep a static tablix member (row or column) visible while scrolling
1. On the design surface, click the row or column handle of the tablix data region to select it. The Grouping pane displays the row and column groups.
2. On the right side of the Grouping pane, click the down arrow, and then click Advanced Mode. The Row Groups pane displays the hierarchical static and dynamic members for the row-groups hierarchy, and the Column Groups pane shows a similar display for the column-groups hierarchy.
3. Click the static member (row or column) that you want to remain visible while scrolling. The Properties pane displays the Tablix Member properties.
4. In the Properties pane, set FixedData to True.


https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEi9E5zcImlg0alUz9xgyG6Yled3gGru5QdUa8cX-VvjXsW3-gL5Ycz_ajLDb4Cx30aSBhyphenhyphengvFbqKT0e9Z-B_2XpKLbrILE8W65IIrnwg62XVohR2KGrVYvQGXQL7DAxGf6CKElobj7tcW4/s400/2.jpg

20. How to add Page Break

1. On the design surface, right-click the corner handle of the data region and then click Tablix Properties.
2. On the General tab, under Page break options, select one of the following options:
Add a page break before: Select this option when you want to add a page break before the table.
Add a page break after: Select this option when you want to add a page break after the table.
Fit table on one page if possible: Select this option when you want the data to stay on one page.
https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjpMyLZSzwr-DDwSIWyFod1Nfm_R_gntTUhjU_R0y5V7W-lcC_E9KH3n2Pr7QJTK2czJudQmlZPBLqDovHtSHYiTL6Z_0QfKHunif8Z6tmDXXkEfzDWl5ElJw1W4YrU_r8t1PDv_b1I8PA/s400/5.jpg

21. A main report contain subreport also. Can we export both main report and subreport to Excel?
Yes. The exported report contains both the main report and the subreport.
22. how to convert PDF report from Portrait to Landscape format?
In Report Properties -->
Set the width of the report to the landscape size of your A4 paper: 29.7 cm
Set the height of the report to 21 cm.

To avoid extra blank pages during export, the size of the body should be less or equal to the size of the report - margins.
Set the width of the body to 26.7 cm (29.7 -1.5 - 1.5)
Set the height of the body to 18 cm (21 - 1.5 -1.5)

23. Error handling in Report
Step 1: All the datasets of the report should contain one additional input parameter that passes unique information for every request (every click of the View Report button) made by the user.
Step 2: Implement TRY...CATCH blocks in all the stored procedures used by the report's datasets. The CATCH section of every procedure should save the error details into a DB table if any error occurs while executing that procedure.
Step 3: Add one more dataset named "ErrorInfo" that calls the stored procedure (USP_ERROR_INFO). This procedure accepts the same unique value that is passed to all the datasets for each click of the View Report button, and returns the error information saved in the database table for that unique id.
Step 4: Enable the “Use Single Transaction When Processing Queries” option in the data source properties, which makes all the query executions run in a single transaction.
Step 5: After completing the steps above, add a new table to the SSRS report that shows the custom error information to the report user if any error occurs during execution of the report.

24. Have u worked on any 3rd party Report Tools
There are a few third-party report tools, such as Nevron and Izenda.

25. Different ways of Deploying reports
1. We can deploy the reports using rs.exe tool
2. In the Solution Explorer,

2.1.Right-click the report project, and then click Properties.
2.2.In the Property Pages dialog box for the project, select a configuration to edit from the Configuration list. Common configurations are DebugLocal, Debug, and Release.
2.3.In StartItem, select a report to display in the preview window or in a browser window when the report project is run.
2.4.In the OverwriteDataSources list, select True to overwrite the shared data source on the server each time shared data sources are published, or select False to keep the data source on the server.
2.5.In the TargetDataSourceFolder text box, type the folder on the report server in which to place the published shared data sources. The default value for TargetDataSourceFolder is Data Sources. If you leave this value blank, the data sources will be published to the location specified in TargetReportFolder.
2.6. In the TargetReportFolder text box, type the folder on the report server in which to place the published reports. The default value for TargetReportFolder is the name of the report project.
2.7. In the TargetServerURL text box, type the URL of the target report server. Before you publish a report, you must set this property to a valid report server URL.

https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEg_nYYMA8HzWM7kg4qF4filATUYBqsNOostbBJNriYZVs0dzim9DunbRs_zUte548vqttCZ0Spzsr0pzPfn_Hzzt7Wvs8_QAD8sjKLsyI9fUTS97HQsOSl1Src-e0aFlOv3gTFmTB8ynwA/s640/deploy.jpg


3. There are 2 options for deploying the reports that you create with Report Builder 3.0:
1. Report Manager
2. SharePoint document library
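
A sketch of option 1 above: rs.exe runs a VB.NET script file (.rss) against the report server's web service. The server URL, folder, and file paths below are placeholders, not values from this post.

```vb
' deploy.rss - hypothetical deployment script; invoked as:
'   rs.exe -i deploy.rss -s http://myserver/reportserver -e Mgmt2010
Public Sub Main()
    ' Read the report definition from disk (path is an assumption)
    Dim definition As [Byte]() = System.IO.File.ReadAllBytes("C:\Reports\SalesReport.rdl")
    Dim warnings As Warning() = Nothing

    ' Publish the report to the /Demo folder, overwriting any existing copy
    rs.CreateCatalogItem("Report", "SalesReport", "/Demo", True, definition, Nothing, warnings)
End Sub
```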
26. Difference between Cached Report and Snapshot Report
A cached report is a saved copy of a processed report.

The first time a user clicks the link for a report configured to cache, the report execution process is similar to the on-demand process. The intermediate format is cached and stored in the ReportServerTempDB database until the cache expiry time.
If a user requests a different set of parameter values for a cached report, the report processor treats the request as a new on-demand execution, but flags it as a second cached instance.

A report snapshot contains the query and layout information retrieved at a specific point in time. It executes the query and produces the intermediate format. Unlike a cached instance, the intermediate format of a snapshot has no expiration time, and it is stored in the ReportServer database.
27. Subscription. Different types of Subscriptions?
Subscriptions are used to deliver reports to a file share or to email, in response to a report-level or server-level schedule.
There are 2 types of subscriptions:
1. Standard Subscription: the delivery properties (recipient, format, schedule) are static and set once.
2. Data-Driven Subscription: the delivery properties are determined dynamically at run time by a query, so recipients, formats, and parameter values can vary per delivery.
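
In a data-driven subscription, the delivery settings come from a query executed when the subscription fires. A sketch of such a recipient query follows; the table and column names are hypothetical.

```sql
-- Each row returned drives one delivery; columns are mapped to
-- delivery settings (To address, render format) and report parameters.
SELECT EmailAddress,
       RenderFormat,     -- e.g. 'PDF' or 'EXCEL'
       RegionParam       -- value fed to the report's @Region parameter
FROM   dbo.SubscriberList
WHERE  IsActive = 1;
```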

28. SSRS Architecture

29. How to deploy Reports from one server to other server

30. Different life cycles of Report
1.Report authoring:
This stage involves the creation of reports, which are published using the Report Definition Language (RDL), an XML-based industry standard for defining reports.
The main authoring tools are Report Designer, a full-featured tool that runs in Business Intelligence Development Studio, and Report Builder.
2. Report management:
This involves managing the published reports as part of the web service. Reports are cached for consistency and performance, and can be executed on demand or on a schedule.
In short Report Management includes:
- Organizing reports and data sources,
- Scheduling report execution and delivery
- Tracking reporting history.
3. Report delivery:
Reports can be delivered to consumers either on demand or based on an event, and can then be viewed in a web-based format.
–Web based delivery via Report Manager web site
–Subscriptions allow for automated report delivery
–URL Access, Web Services and Report Viewer control
4.Report security:
It is important to protect reports as well as the report resources; therefore, Reporting Services implements a flexible, role-based security model.
31. Different type of Reports
Linked report:A linked report is derived from an existing report and retains the original's report definition. A linked report always inherits report layout and data source properties of the original report. All other properties and settings can be different from those of the original report, including security, parameters, location, subscriptions, and schedules.
Snapshot reports: A report snapshot contains layout information and query results that were retrieved at a specific point in time. Report snapshots are processed on a schedule and then saved to a report server. 
Subreport: A subreport is a report that displays another report inside the body of a main report. The subreport can use different data sources than the main report. 
Cached reports: A cached report is a saved copy of a processed report. Cached reports are used to improve performance by reducing the number of processing requests to the report processor and by reducing the time required to retrieve large reports. They have a mandatory expiration period, usually in minutes.
Drill Down Report: Means navigate from the summary level to detail level in the same report.
Drill Through Report: Navigation from one report to another report.
Ad hoc reports: Ad hoc reporting allows end users to design and create reports on their own, using the data models provided.
3 components: Report Builder, Report Model and Model Designer
Use the Model Designer tool to design report models, and then use Report Builder to generate reports from them.
Report Builder
- A Windows WinForms application for end users to build ad hoc reports with the help of report models.
32. Explain the Report Model Steps.
1. Create the report model project
select "Report Model Project" in the Templates list
A report model project contains the definition of the data source (.ds file), the definition of a data source view (.dsv file), and the report model (.smdl file).
2. Define a data source for the report model
3. Define a data source view for the report model
A data source view is a logical data model based on one or more data sources.
SQL Reporting Services generates the report model from the data source view.
4. Define a report model
5. Publish a report model to report server.
33. How to get the data for Report Model Reports
Datasource View
34. Difference between RDL and RDLC?
RDL files are created for SQL Server Reporting Services, and .RDLC files are for the Visual Studio ReportViewer component.

The Query element of an RDL file contains the query or command and is used by the Report Server to connect to the data sources of the report.
The Query element is optional in an RDLC file. It is ignored by the ReportViewer control, because in local processing mode the control performs no data processing of its own and instead uses data supplied by the host application.
35. Difference between Sorting and Interactive Sorting?
To control the sort order of data in a report, you must set the sort expression on the data region or group. The user has no control over this sorting.

You can give the user control by adding interactive sort buttons that toggle between ascending and descending order for rows in a table, or for rows and columns in a matrix. The most common use of interactive sort is to add a sort button to every column header; the user can then choose which column to sort by.
36. What is Report Builder
Windows Winform application for End users to build ad-hoc reports with the help of Report models.
37. Difference between Table report and Matrix Report
A table report has a fixed number of columns and a dynamic number of rows.
A matrix report has both dynamic rows and dynamic columns.
38. When to use Table, Matrix and List
1. Use a Table to display detail data, organize the data in row groups, or both.
2. Use a matrix to display aggregated data summaries, grouped in rows and columns, similar to a PivotTable or crosstab. The number of rows and columns for groups is determined by the number of unique values for each row and column group.
3. Use a list to create a free-form layout. You are not limited to a grid layout, but can place fields freely inside the list. You can use a list to design a form for displaying many dataset fields, or as a container to display multiple data regions side by side for grouped data. For example, you can define a group for a list; add a table, chart, and image; and display values in table and graphic form for each group value.
39. Report Server Configuration Files
1. RSReportServer.config:
Stores configuration settings for feature areas of the Report Server service: Report Manager, the Report Server Web service, and background processing.
2. RSSrvPolicy.config
    Stores the code access security policies for the server extensions.
3. RSMgrPolicy.config
   Stores the code access security policies for Report Manager.
4. Web.config for the Report Server Web service
   Includes only those settings that are required for ASP.NET.
5. ReportingServicesService.exe.config
6. Registry settings
7. Web.config for Report Manager
    Includes only those settings that are required for ASP.NET
8. RSReportDesigner.config
9. RSPreviewPolicy.config
40. Difference between a Report and adhoc Report
Ad hoc reporting allows end users to design and create reports on their own, using the data models provided.
An ad hoc report is created from an existing report model using Report Builder.
41. How do u secure a Report
1. Authorization is provided through a role-based security model that is specific to Reporting Services.
Different Types of Roles provided by SSRS :
- Browsers
- Content Manager
- My Reports
- Publishers
- Report Builder
2. IIS security controls access to the report server virtual directory and Report Manager.

42.How to Combine Datasets in SSRS (1 Dataset gets data from Oracle and other dataset from Sql Server)
Using the Lookup function, we can combine 2 datasets in SSRS.
In the following example, assume that a table is bound to a dataset that includes a field for the product identifier ProductID. A separate dataset called "Product" contains the corresponding product identifier ID and the product name Name.

=Lookup(Fields!ProductID.Value, Fields!ID.Value, Fields!Name.Value, "Product")

In the above expression, Lookup compares the value of ProductID to ID in each row of the dataset called "Product" and, when a match is found, returns the value of the Name field for that row.
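
A related, documented function is MultiLookup: when a field holds a delimited list of keys, it returns the set of matching values, which Join can flatten back into a string. The field names below follow the Lookup example above; ProductIDList is a hypothetical comma-separated field.

```
=Join(
    MultiLookup(Split(Fields!ProductIDList.Value, ","),
                Fields!ID.Value, Fields!Name.Value, "Product"),
    ", ")
```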

43. Difference between Report Server and Report Manager
The Report Server handles authentication, data processing, rendering, and delivery operations.

The configuration settings of Report Manager and the Report Server Web service are stored in a single configuration file (rsreportserver.config).
Report Manager is the web-based application included with Reporting Services that handles all aspects of managing reports (deploying datasources and reports, caching a report, subscriptions, snapshot).
44. Steps to repeat Table Headers in SSRS 2008?
1. Select the table
2. At the bottom of the screen, click the dropdown arrow beside Column Groups and enable "Advanced Mode".
3. Under Row Groups, select the static row and choose Properties (or press F4).
4. Set the following attributes for the static row or header row.
    Set RepeatOnNewPage= True for repeating headers
    Set KeepWithGroup= After
    Set FixedData=True for keeping the headers visible.
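
The same three settings are stored in the report's RDL. A trimmed fragment (surrounding elements omitted) of the static TablixMember for the header row would look roughly like this:

```xml
<TablixMember>
  <KeepWithGroup>After</KeepWithGroup>
  <RepeatOnNewPage>true</RepeatOnNewPage>
  <FixedData>true</FixedData>
</TablixMember>
```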
45. How to add assemblies in SSRS
https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEju9q_jlOdwkWK5pFfKy_8FohPLPxIfRYV4wHds8eQvizPvaPB32P0lAMiSvfjT9ir8-cICGUHrewKPLE_Rp_csI1ZzsR-YJcNz3UHEqd0wE5iVKDp2OKpN22L7Ewcw8Q2VzhyphenhyphenCkmGi1pI/s400/6.jpg

45. Report Extensions?

46. parent grouping, child grouping in SSRS

47. How to show "No Data Found" Message to end user?

Add a text box with the expression =IIF(CountRows("DataSet")=0, "No Data Returned", Nothing)
and set the Hidden property of this text box with the expression =IIF(CountRows("DataSet")=0, False, True)
https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgxY44Vxe22U-Q2VQ4X6IgAfErYmE2tuDZAYQ_Xzx84RaSOSJ86bPI_mJkB8MrQg6EiYW_HORjL3Zov9J4fq-pYSSsE2IvjqG9vMOsU6dQ6crkDqpyKXBq3cMzB-XQDf1ygltiRjuOznAU/s400/nodatafound.jpg

48. What is the 'Use single transaction when processing the queries' in the Datasource? Dataset Execution Order?

By default, datasets are executed in parallel. 


This option is used to reduce the number of open connections to the database. For example, if you have a report with 3 datasets and this option is not checked, a new connection is made to the database for every single dataset. With it checked, only one connection is opened, all the datasets return their data over it, and the connection is then closed. This can reduce network traffic and potentially improve performance.

Open the data source dialog in Report Designer and select the "Use single transaction when processing the queries" check box. Once selected, datasets that use the same data source are no longer executed in parallel. They are also executed as a single transaction, i.e. if any of the queries fails to execute, the entire transaction is rolled back.
https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhAn5IXzLmnDqAP_BvpYg76JpxlhTaDtRYCdRy1vqN2V8bKP3K3ZWRWuF1zWVMZT9w4WBi7Y3LtYiQKqfbh2giG-U5NEhuFvxh7vLoSAV2un29D9WmDOFLL69-Xnv6UIyvEA_zUBqDLkCI/s400/use.jpg

The order of the dataset execution sequence is determined by the top-down order of the dataset appearance in the RDL file, which also corresponds to the order shown in report designer.
49. ReportServer and ReportServerTempDB Databases
ReportServer: hosts the report catalog and metadata.
For example, it keeps catalog items in the Catalog table and data source information in the DataSource table of the ReportServer database.
ReportServerTempDB: used by Reporting Services for caching purposes.
For example, once a report is executed, the Report Server saves a copy of the report in the ReportServerTempDB database.
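
As an illustration, the catalog can be inspected directly with a query like the one below. The schema of the ReportServer database is undocumented and version-dependent, so treat this as read-only exploration; the Type codes shown are commonly cited values, not a guarantee.

```sql
-- List deployed reports and shared data sources from the catalog
SELECT Name, Path, Type
FROM   dbo.Catalog
WHERE  Type IN (2, 5)   -- 2 = report, 5 = shared data source (assumed codes)
ORDER BY Path;
```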
https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhOjSwBhSeSlS8YUnExDyLH0-osEYtlBuq2LKsoCmlO0bAD2zMAKlppLNoVlMgNsobCSZdF41vxxxBSMsGTzgN_MnPQ7XgaT-cuX-rYZyJZS9YokfiwavUchtAQMtMB432-RAvMftlZhJU/s400/ReportServer.jpg