Sunday, December 23, 2007

BI Master - Good or Bad Decision?!

Dear Friends,
I finished my first trimester of lessons… I still have one more year... and maybe you are curious to know whether it was a good or bad decision... I sincerely tell you: yes, it was one of the best decisions I have made in recent times. I feel it was a good decision because I love business intelligence (as a set of small parts) and I found excellent colleagues and excellent teachers.



I am curious to know if the quality of the teachers will be the same in the next trimesters, but I will keep you informed! :-)

As I told you before, in this first trimester I had three classes:

1. Decision Support Systems
2. Data Warehouse Projects
3. Data Analysis (Statistics)

Sincerely, I like all three classes, but my main preference goes to the first two! Data Analysis, I think, is not difficult, but you need to spend some time studying hard… The other two classes also require hard study, but in my case they are a way to improve my daily work as a BI developer at my company. So I'm very motivated!

BI Master - DSS Management tasks vs levels

Dear Friends,
I'm here today to update an old post about management tasks and levels. I'm doing that because at this moment I have a better perspective of what Business Intelligence is and how it relates to the different management tasks at different levels inside a company.
http://pedrocgd.blogspot.com/2007/11/bi-master-management-tasks-and-sub.html
So, Planning and Control are very important, but they don't make sense without other tasks like Leading, Organizing and Acting.


Planning is a process that helps us decide: decide what must be done, for whom, and when it must be done. To plan, we have to define the objectives well; they must be realistic, measurable and complete. With the objectives well defined, the plan guides us towards the final objective, overcoming intermediate obstacles and making us spend less time, energy and resources to reach that goal.

A great DSS has to be 100% under the control of the manager who is primarily responsible for making a decision: he is the decision maker and, at the same time, the leader who must put all the plans into action.

Sunday, December 16, 2007

BI Master - News

Dear Friends,
It has been a long time since I last wrote on my blog… sorry for that, but the last weeks were terrible for me, with a lot of work not only in my current job but also in my fantastic BI master degree.
I have a lot of things to write and to share with you… after next Thursday I will be more available to write here and share some interesting subjects with you.
Kind regards!

Tuesday, November 20, 2007

OFF-TOPIC - Help someone at the right moment

Something like Business Intelligence...

"LEIA O TEXTO ABAIXO ANTES DE VER O V Í D E O :
A menina, 13 anos, ganhou um prêmio e foi cantar o Star Spangled Banner,hino dos EUA, no jogo da NBA.Vinte mil pessoas no estádio, ela afinada e compenetrada.
De repente o braço tremeu, ela engasgou, esqueceu a letra... DEU BRANCO!!!Treze anos. Sozinha, ali no meio...O PÚBLICO ESTUPEFATO ameaça uma VAIA...
Num repente, Mo Cheeks, técnico dos Portland Trail Blazers, aparece ao seu lado e começa a cantar, incentivando-a, e trazendo o público junto.Bonita CENA e - o que é mais incrível - ... Só o técnico tomou a iniciativade ir até lá para ajudar, enquanto os demais à volta dela só observavam estupefatos...
Mostre como uma atitude de liderança e solidariedade, NA HORA CERTA, pode fazer uma grande diferença, para ajudarmos um ser humano e mudar a história do JOGO da vida .Será que isso já não aconteceu nas nossas vidas? E a nossa atitude foi a do técnico Mo Cheeks ou da de todos que estavam ao redor, comum e de descaso? TEM GENTE QUE ESTÁ NO MUNDO PARA AJUDAR... OUTROS PARA VAIAR. PENSE NISSO. AGORA VEJA O FILME..."



Received by e-mail

Thursday, November 1, 2007

OFF-TOPIC - Google AdSense (1 Month Analysis)

Dear Friends,
In this post you can see where the visitors of this blog come from and how many there were in the last month. The main reason for writing this blog in English is to describe my master degree experiences and my current job as a BI developer (problems and solutions) to as many people as possible. (I'm receiving visits from 36 different countries.)
I also think writing the blog in English could be a better way to achieve one of my life objectives: going back to work in England (and this time not to study but to work as a BI Analyst/Developer), or maybe going back to work in Italy, or working in Spain (Barcelona).


"Life is too short to spend time in routine tasks!" :-) Made by me!





Regards,

BI Master - Management Tasks and Sub-Tasks

Few managers realize that a company plan must provide the framework for the company control system. If missions, goals, strategies, objectives, and plans change, then controls should change. Although this error occurs at the top, repercussions are felt at all levels.

The picture below shows how the management levels relate to the management tasks: a cross join between planning and control and the different levels of management.


Planning and control are the two most closely interrelated management functions, as our teacher told us in the last class.
The planning function entails deciding today what to do tomorrow. The planning horizon could be several years, one year, a few weeks or a single day (an example is my current project, which gives traders information on bonds, equities, cash flows and other instruments).

Once the plan is implemented, the control function takes over to assure the accomplishment of the plan's objectives. This requires actual outcomes to be monitored and evaluated, and corrective action must be taken when results deviate from the planned objectives. The corrective action is taken at the tactical level, and a revision of the original objectives may be needed.

Sunday, October 28, 2007

BI Master - Attributes of Decision Process

The attributes of a decision process are represented in the decision-data pyramid, which shows the transformation of ‘Data’ to ‘Information’ to ‘Knowledge’ to ‘Decision’, with rapidly increasing ‘value or worth’ as we move from the bottom of the decision pyramid to the top. Data by themselves have little value or worth; once they are processed, analyzed, and organized, they become information. Once the information is processed by the user to develop and enhance understanding of a situation or a problem, it turns into knowledge. It is this knowledge that can reduce the risk of undesirable consequences of a decision.

Operational decisions involve day to day operations, dealing with problems that often become well known, such as how many units of a product to stock, or what employee to assign to what task. These decisions usually are repeated quite often.

At the other extreme, strategic decisions involve the long-range plan of the organization. Strategic decisions occur much less often than operational decisions, but they affect the entire firm, may involve large sums of money, and there is inherently more risk involved in making them.

Tactical decisions are usually required more often than strategic decisions, but not as often as operational decisions. Middle level managers usually have more discretion than operational decision makers, but are constrained by strategic plans.



Strategic Level
Organizational vision and mission
Organizational policies
Target markets
Long-range goals
Future-oriented

Tactical Level
Middle management
Mid-range goals
Effective resource allocation
Budgeting, for example.

Operational Level
First line management
Short-range goals
Day to day activities (efficiency)

There are people who are so focused on the goals, and who give so much importance to the methodologies, that a problem stops being a problem.
The classic decision-data pyramid is transformed into a diamond (lozenge) shape, because the solution to a problem gets repeated and converted into an automated process, as you can see in the following picture:


Following the Gorry and Scott Morton approach, the degree of structure of a problem changes as decision makers learn more about it.

And each problem can be classified at different levels:



This picture illustrates the decision problem well, and following my teacher's homework request, in the next post I will describe the most important level: the management level, with its tasks and sub-tasks.

Tuesday, October 23, 2007

BI Master - DSS Concept and Components

Dear Friends,
Last week was very hard for me. I was dealing with a big problem in the data warehouse design that I described in the previous post, and consequently I was sick for two days. With the support of my data warehouse teacher, Patricia Narciso, the problem is now resolved, and I will give you the full explanation in the next post.


Now that I have excused myself, I will start writing a little more about the concepts I have been collecting from my BI master degree and from some books I have been reading, recommended by my SIAD teacher, Maria José Trigueiros.

The subject is Decision Support Systems, DSS. Do you know what that is?! If not, don't worry, I only heard these words a few weeks ago… Everything has a beginning!

The study of DSS is really about people: about how people think and make decisions, as well as how they act on and react to those decisions. There isn't a universal definition of DSS, but there are several researchers on this subject that I agree with.

“DSS are interactive computer-based systems, which help decision makers utilize data and models to solve unstructured problems” (Scott Morton, 1971)

The DSS researchers best known from my master degree classes, and the authors of some best-selling decision-making books, are Herbert Simon, George Marakas, Samuel Bodily and Covey. I know there are more experts on this subject, but in this post I will focus on Simon's three-phase decision-making process and on the DSS components.

DSS components



Decision Making
A process of choosing among alternative courses of action for the purpose of attaining a goal or goals.

The DSS Goal
Increase the effectiveness of decision making.

Will continue........

Sunday, October 14, 2007

BI CaseStudy - Slowly Changing Dimension (Type2)

The Problem
Storing historic data in a ManyToMany relation between two dimensions.

The Scenario
Each day I need to import, via ETL, about 5,000 rows of financial movements into a fact table.
This data must be viewed through several dimensions: Entidade, Carteira, Mesa, Folder and FolderGroup. Each dimension has several attributes that belong only to that dimension, and these attributes must be stored over time. So I need to store historic data for each dimension, and in this project there are two important areas:
1. Entidade, Carteira, Mesa and Folder are typical Slowly Changing Dimensions (Type 2), and these dimensions are structured as a snowflake. (Yes… I know Kimball doesn't recommend the use of snowflakes, but I think in this case the data is easier to understand and the data model will be easier to maintain in the future.)
2. FolderGroup is a ManyToMany relation between the Folder dimension and the Group dimension, with a bridge table to link them both.

Now I'll show the current structure and then I'll explain my problem.
As I told you before, my problem is that I cannot create a Slowly Changing Dimension (Type 2) when there is a ManyToMany relation between dimensions.
In my project I have a typical SCD (Type 2) that is working well, and I feel the data model is very “clean” and understandable.



Sample Data from Slowly Changing Dimension (Type 2)
I split Entidade into Entidade and EntidadeHistoric because most of its attributes never change, and this way I avoid redundant data in the dimension. If for any unexpected reason I need to change ENT_Name or ENT_RelatorioKVaR, it will be a typical SCD (Type 1) change and I'll do a classic update of the dimension row. The other reason for this split is that I only need to save historic data for hENT_RacioCooke. (I have other fields to store, but for this example I'll omit them.)

The value 9020 is the foreign key to the Time table and corresponds to the date 10-10-2003.
So, from this picture we can see that for Lisbon the field hENT_RacioCooke is 0,5 between dates 8888 and 9020, and 0,1 from 9021 until now.
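Since the screenshots may be hard to read, the sketch below shows roughly how this split can be expressed in T-SQL. Only ENT_ID, ENT_Name, ENT_RelatorioKVaR and hENT_RacioCooke come from the real model; the surrogate-key and validity columns are placeholder names.

-- Sketch only: placeholder surrogate-key and validity column names.
CREATE TABLE dbo.Entidade (
    ENT_SK            INT IDENTITY(1,1) PRIMARY KEY,  -- surrogate key
    ENT_ID            VARCHAR(20)  NOT NULL,          -- business key, kept as a string
    ENT_Name          VARCHAR(100) NOT NULL,          -- Type 1 attribute: updated in place if it ever changes
    ENT_RelatorioKVaR VARCHAR(50)  NULL               -- Type 1 attribute as well
);

CREATE TABLE dbo.EntidadeHistoric (
    hENT_SK         INT IDENTITY(1,1) PRIMARY KEY,    -- surrogate key referenced by the fact table
    ENT_SK          INT NOT NULL REFERENCES dbo.Entidade (ENT_SK),
    hENT_RacioCooke DECIMAL(9,4) NOT NULL,            -- the attribute tracked over time (Type 2)
    TIME_ID_From    INT NOT NULL,                     -- foreign key to the Time table (e.g. 8888)
    TIME_ID_To      INT NOT NULL                      -- foreign key to the Time table (e.g. 9020); open rows use a far-future key
);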

Sample data for my Snowflake schema
I decided to create the relations below as a snowflake, because I felt it could be a better way to maintain the data in the future.
The business keys at each level (ENT_ID, CRT_ID, MESA_ID and FOLDER_ID) are kept in string format to stay consistent with the ETL source data and to be more comprehensible for the admin users who will analyze data directly in the OLAP and OLTP databases.


Based on the images posted before, I will show you how the SCD (Type 2) works in my project.
I decided not to include the whole structure and to focus only on the essential data needed to understand my pending problem.


As you can see in the picture above, this is my SCD (Type 2). I'll test whether this structure is working well by creating the following SQL query:

And the result is:

Imagine you need to add a new field to the Dados table with the formula: D_MeasureX_ENT = D_MarketValue * hENT_RacioCooke.
For the same D_MarketValue value (10.000) on days 9019 and 9023, the new field D_MeasureX_ENT is different, because the RacioCooke was 0,5 between 8888 and 9020 and has been 0,1 from 9021 until now.
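In T-SQL terms, and reusing the placeholder names from the sketch above, the calculation behind that comparison would look something like this (my real query in the screenshots uses the project's own table and column names):

SELECT  d.TIME_ID,
        e.ENT_ID,
        h.hENT_RacioCooke,
        d.D_MarketValue,
        d.D_MarketValue * h.hENT_RacioCooke AS D_MeasureX_ENT
FROM    dbo.Dados AS d
        INNER JOIN dbo.EntidadeHistoric AS h ON h.hENT_SK = d.hENT_SK  -- the fact row stores the historic surrogate key
        INNER JOIN dbo.Entidade         AS e ON e.ENT_SK  = h.ENT_SK
WHERE   d.TIME_ID IN (9019, 9023);   -- same market value, different RacioCooke version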


OK… now that you understand my project, I will start describing my real problem. Everything I wrote above can still be changed and discussed, but for now it is working perfectly; what is urgent for me is to resolve my current problem… although I accept suggestions for this part too!

In the project I have to group folders into groups and store their history, as I did before for Entidade, Carteira, Mesa and Folder. Remember that each history depends on its own dimension; I need to group 3 or 4 folders into a group, and that group has its own specific RacioCooke, just like Entidade, Carteira, Mesa and Folder. The problem is that a folder in the fact Dados can belong to more than one group.
While in the SCD described before there is a OneToMany relation, so you can store the surrogate key as a foreign key in the fact Dados, in this case you cannot, because the relation is ManyToMany. I tried to use the structure below, but when I tested it, it didn't work, because it would create duplicate records in my fact Dados.
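For context, the classic Kimball answer to a ManyToMany relation is a bridge table between the two dimensions; a bare-bones sketch with placeholder names (not the exact structure I tried in the picture) looks like this:

CREATE TABLE dbo.[Group] (
    GRP_SK   INT IDENTITY(1,1) PRIMARY KEY,   -- surrogate key
    GRP_ID   VARCHAR(20)  NOT NULL,           -- business key, e.g. 'GRP01'
    GRP_Name VARCHAR(100) NOT NULL            -- e.g. 'Investment'
);

CREATE TABLE dbo.BridgeFolderGroup (
    GRP_SK    INT NOT NULL REFERENCES dbo.[Group] (GRP_SK),
    FOLDER_SK INT NOT NULL,                   -- surrogate key of the Folder dimension
    PRIMARY KEY (GRP_SK, FOLDER_SK)
);
-- The fact Dados keeps pointing only at Folder; it is the join through the bridge
-- that multiplies rows per group, which is exactly the duplication described above.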

The Big Question
What is the hGRP_RacioCooke for the row 2131 in Fact DADOS for GRP01 (Investment) and for GRP02 (Negotiation)?

The Solution
I have already found the solution and I will explain it later.
Kind Regards

Sunday, October 7, 2007

BI Tools - First look at Microsoft BI Tools

Dear Friends,
Following a suggestion of Rui Santos (master colleague), I'll try to write a draft about Microsoft BI Tools using my current BI project.
As you probably know, the current version of SQL Server 2005 has two separate environments, one for management and one for development.
The development environment is called Business Intelligence Development Studio (BIDS) and is focused mainly on the business intelligence area. It includes a module for data integration (SSIS), another for data analysis (SSAS) and another for reporting (SSRS). The management environment is called SQL Server Management Studio (SSMS), and it is there that you do all the work of managing your databases.

I don't know the other BI tools yet, but what I hear from the community is that the Microsoft tools are more “user-friendly” than the others. In my master I will try the SAS tools, and then I will find out whether that is true or not.
I'm very satisfied with Microsoft, and if BI is more popular today, it is because Microsoft is investing a lot in BI technologies. Why am I saying this? Well… it is my opinion, but there have been big improvements, particularly SQL Server 2005, SharePoint 2007 (the 2003 version was usually used just for file sharing), the acquisition of ProClarity (tools to analyze BI data), the acquisition of Dundas software (reporting tools), Office 2007… and others.


For these reasons, I think BI is the right path for our careers, particularly Microsoft BI! But if we know the BI concepts, with some extra work we can use other tools like SAS, WebFOCUS, SAP and others without problems.

My current BI project is very complex, and the old system was too difficult to understand (http://pedrocgd.blogspot.com/2007/06/ssis-temporary-image-to-msdn-forum.html) because it was something like a data warehouse built inside Excel and Access files… can you imagine that?!
Overall, the big difficulties were understanding the old system and defining the best architecture.

Below I will show you some screenshots of Microsoft BI tools in my project.

SSIS - SQL Server Integration Services
As I wrote in a previous post, the goal of BI is to provide the business with high-quality data, using intelligence! This module provides the right tools to transform, clean and audit data from several data sources and to ensure that this data is inserted correctly and is compatible with the destination structure. That is my concept of ETL.

Control Flow

Control Flow (Sequence container expanded)


SSAS - SQL Server Analysis Services
This module consists of two major and complementary pieces of functionality: OLAP and data mining.
In this module you can add KPIs (corporate measures), perspectives and translations, and you must know MDX, the language for querying multidimensional databases.
MDX is similar to SQL, but while with SQL we query data from OLTP databases, with MDX we query data from OLAP databases.
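Just to illustrate the difference, a very simple MDX query has the same SELECT … FROM shape as SQL, but puts measures and dimension members on axes (the cube, measure and dimension names below are only placeholders, not from my project):

-- placeholder cube, measure and dimension names
SELECT
    [Measures].[Market Value] ON COLUMNS,   -- one measure on the columns axis
    [Time].[Year].Members     ON ROWS       -- every year member on the rows axis
FROM [Rentabilidade];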


Data Source View

Cube Designer

Calculations Designer (MDX Syntax)

Cube Browser

Cube Browser with some data


SSRS - SQL Server Reporting Services
Using MDX, it's possible to create queries against an OLAP database and display the results in a table, a matrix or a chart.

Data designer (wizard)

MDX Data Designer

Report Designer


Don't forget, this is just a draft about each SQL Server BI module. If you think a more specific description of any subject could be useful, I will describe it in more detail. (In the right panel you can see links to my preferred books.)
Regards!

Thursday, October 4, 2007

BI CaseStudy - SSIS Package v2

Dear Friends,
In a previous post I described the first version of the main SSIS packages of my current project.
http://pedrocgd.blogspot.com/2007/06/ssis-packages-structures-v1.html

Today I will describe how the main package has changed. I'll probably make some changes to this version, depending on your feedback.
The majority of these objects are sequence containers, and each of them has several steps inside.





This package will be executed, in 95% of cases, once per day, and it will import data from several data sources such as SQL Server and Sybase. Sometimes the package may be executed not for a single date but for a range between two dates, to correct an unexpected error or for some other unexpected reason.

1. SEQ VALIDATE Surrogate Keys
Step to validate my surrogate keys. (This step will be explained in a future post)

2. SEQ GET Configuration
Gets the start and end dates from the database table Configuration. These dates can be set manually (stored in the DB) or automatically (taking the last date from the main fact table Dados). A rough sketch of the kind of logic behind steps 2 to 4 and 11 is shown after this list.

3. SEQ VALIDATE LastRunDate DADOS
Step to avoid inserting duplicate rows for a day that already exists in the fact table Dados.

4. SEQ VALIDATE LastRunDate CASHFLOWS
Step to avoid inserting duplicate rows for a day that already exists in the fact table CashFlows.

5. SEQ IMPORT SpotRates
Step to import currency rates based on Euro.

6. SEQ TRUNCATE Tables (Temporary)
Step to truncate tables. I will improve this step soon.

7. SEQ IMPORT TO FactTable CASHFLOWS
Step to import data into the fact tables CashFlows, BondsCouponLiq and BondsAmortLiq.

8. SEQ GET IDENTITY Initial Values
Step to GET initial dimension identity values.
http://pedrocgd.blogspot.com/2007/05/ssis-populating-dimension_28.html

9. SEQ IMPORT TO FactTable DADOS
Step to import data into the fact table Dados and populate the CounterParty and Instruments dimensions. Inside the sequence container there are several data flows that import data into FactDados for each instrument type, such as Equities, Bonds, Swaps, CapsFloor, Credit Swaps…

10. SEQ UPDATE FactTable DADOS
Step to update several fields, such as Basel weight and risk, on the rows inserted by the previous sequence container.

11. SHRINK Database dbRentabilidade
Step to compact space in the database

12. SEND MAIL with SSIS Report
Step to send an e-mail with SSIS Report.
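To make steps 2, 3, 4 and 11 a little more concrete, here is a rough T-SQL sketch of the kind of logic those sequence containers run; apart from Configuration, Dados and dbRentabilidade, the table and column names are placeholders, not the real ones from the package.

DECLARE @StartDateID INT, @EndDateID INT;

-- Step 2: read the date range from the Configuration table (manual mode)...
SELECT  @StartDateID = StartDateID,
        @EndDateID   = EndDateID
FROM    dbo.Configuration;

-- ...or derive the start automatically from the last day already loaded into Dados.
IF @StartDateID IS NULL
    SELECT @StartDateID = MAX(TIME_ID) + 1
    FROM   dbo.Dados;
-- (@EndDateID would normally come from the Time table for the current day; omitted here.)

-- Steps 3 and 4: remove any rows of days that were already loaded, to avoid duplicates.
DELETE FROM dbo.Dados
WHERE  TIME_ID BETWEEN @StartDateID AND @EndDateID;

-- Step 11: compact the database after the load.
DBCC SHRINKDATABASE (dbRentabilidade);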


So, what improvements do you think could be made? Feel free to send feedback!!
Regards!

Thursday, September 27, 2007

BI Events - SAS Portugal

Next November there will be a big event about Business Intelligence, organized by SAS Portugal. The BI community thinks it will be the biggest BI event in Portugal.

...And of course, I will be there!

See further details in the link below:
http://www.sas.com/offices/europe/portugal/eventos/sfp2007/index.html



Regards!

TPC02-SIAD - BI and decision-making definitions

As my teacher told us in the last class, business intelligence is about putting intelligence into the business of an organization.

For me, the main purpose of BI is to provide information that supports decisions, together with all the indicators that allow me to know whether I took the best decision or not. BI filters out irrelevant information and focuses only on the information the user needs.

I don't need to be a decision-maker to know that, to take a decision, I must have valuable data and the intelligence to decide. This intelligence to decide appears, for example, when you have valuable information and understand things like market changes, risk assessments or poor employee performance… and can take preventive decisions in order to fulfil the organization's strategy.

Giving some time to the “gurus” of business intelligence, I found an article that cites some relevant comments from Herbert Simon about decision-making.

“It is work of choosing issues that require attention, setting goals, finding or designing suitable courses of action, and evaluating and choosing among alternative actions. The first three of these activities (fixing agendas, setting goals and designing actions) are usually called problem solving; the last, evaluating and choosing, is usually called decision-making.” in TDWI website




“All four steps of the decision process elaborated by Herbert Simon and many others can be satisfied with analytics software that allows people at any skill level to perform the required operations. Super users will always have a role, and most people in an organization will never develop skills or interest in pursuits such as stochastic processes or simulation, but framing a problem and building a model can be a very simple process. Today, most BI efforts are driven by data, not by models, and the user interfaces, best practices and training are aligned with this approach. Toppling the pyramid means breaking through the data-only model and finding ways to distribute models and applications that can be used and shared by everyone.” in TDWI website

Tomorrow I will try to add additional comments.
Regards!

Tuesday, September 25, 2007

OFF-TOPIC - LinkedIn

Dear Friends,
Do you already know LinkedIn?! If not, go to http://www.linkedin.com/...
It's a website where you can store your profile, find old colleagues and receive job notifications!


Add your profile today and add me as your connection!
Regards!

Thursday, September 20, 2007

TPC01-SIAD - Master HomeWork

Dear Friends,
This post is dedicated to my master homework; the original was written in Portuguese.
Let's do it...

Every day I come across new problems and, as a consequence, new objectives. The appearance of new problems is part of life; sometimes it even seems we go looking for problems ourselves, but without problems life has no objective and makes no sense. That is why my mission in this life is to fulfil them completely and to keep being as happy as I have been until now.
Below I present my table of problems and objectives, which was requested as homework (TPC).

Best regards to everyone and good work!

Sunday, September 16, 2007

BI Master - 1. Weekend

As I told you before, I started my master in “Integrated Systems for Decision Support”. The classes will be on Fridays and Saturdays.

In this first trimester the master will focus on 3 main subjects:
- Integrated Systems for Decision Support (Maria José Trigueiros - ISCTE)
- Management of DataWarehouse Projects (Patricia Narciso - GTBC)
- Data Analysis (Teresa Calapez - ISCTE)

I. Integrated Systems for Decision Support
These first lessons felt like something of a group therapy session!
The main purpose was to make everyone “speak the same language”. There are some basic concepts that you think you know, but in reality we have some difficulty giving them a definition.

There were some questions that made us think…
What's a computer?
What's a problem?
Does every problem have a decision?
If there are no alternatives, is there a decision?
What's intelligence?

Feel the gap!


II. Management of DataWarehouse Projects
In these first lessons, Patricia explained the most important differences between the Bill Inmon and Ralph Kimball approaches, and gave us a generic architecture that I think is a mixture of both.

“Numerous data warehouse practitioners have questioned the different approaches to data warehousing as taught by Bill Inmon and Ralph Kimball. Religious wars have erupted between devout followers of each. As the industry has matured, the two philosophies have become much more aligned with each other. However, the language used to describe their methodology leads to questions of compatibility.”
Read the full Susana Gallas article at the link below:
http://www.dmreview.com/editorial/newsletter_article.cfm?nl=dmdirect&articleId=1400

There is a discussion about these two approaches at the link below:
http://blogs.ittoolbox.com/bi/confessions/archives/kimball-vs-inmonor-how-to-build-a-data-warehouse-10987

III. Data Analysis
The general feeling is that these lessons will be very hard.
They will focus on data analysis with a lot of statistical techniques.
I have only had one lesson on this subject so far, so I will summarize it better next week.


Regards!

Saturday, September 8, 2007

BI Certifications

Dear friends,
Nothing is more important than having a recognized entity certify us as experienced professionals. For that reason, it is very important to keep learning and to get the best certifications in the world. If someone already has one of them, please give us some feedback about it. I will try to take the Microsoft exams this month, and I will post my comments here.
See the links below and read about all the benefits and requirements.

1. MCTS: SQL Server 2005 Business Intelligence


“The Microsoft Certified Technology Specialist (MCTS) certifications enable professionals to target specific technologies and distinguish themselves by demonstrating in-depth knowledge and expertise in their specialized technologies.
Microsoft Certified Technology Specialists in Microsoft SQL Server 2005 Business Intelligence (MCTS: SQL Server 2005 BI) implement and maintain business intelligence solutions. They have thorough knowledge of the Microsoft SQL Server 2005 Business Intelligence Development tool suite. They have knowledge of database schemas and of debugging, monitoring, and troubleshooting Business Intelligence solutions, and they know how to work with Microsoft .NET Framework. They can write database queries, use SQL Server Analysis Services (SSAS) data mining algorithms, and create and deploy SQL Server Reporting Services (SSRS) reports. Additionally, they author and deploy SQL Server Integration Services (SSIS) packages. Typically, Technology Specialists pursue careers as database administrators, database developers, or business intelligence developers. They may also be developers and systems administrators who do not work with SQL Server daily but who want to show their breadth of technology experience.”
In
http://www.microsoft.com/learning/mcp/mcts/bi/default.mspx


2. Certified Business Intelligence Professional (CBIP)
“TDWI, the leading association for business intelligence and data warehousing professionals, offers the industry's most comprehensive certification program available: the Certified Business Intelligence Professional (CBIP). CBIP, a true test-based certification program, is offered in five key specialties for Business Intelligence success: Leadership & Management, Business Analytics, Data Analysis & Design, Data Integration, and Administration & Technology”
In
http://www.tdwi.org/Certification/CBIP/index.aspx

Regards.

BI Events

Dear Friends,
There will be some very good BI conferences in September and October. I would love to attend them, but unfortunately I spent all my money on the BI master that I will begin next Friday.

PASS BI (Colorado, USA)
https://www.sqlpass.org/summit/Pages/default.aspx


TDWI (Orlando, USA)
http://www.tdwi.org/education/conferences/orlando2007/index.aspx

Check the links above for conference dates, contents, prices…
I'm sure these will be fantastic BI events, and next year I will try to fly from Portugal to the United States to attend both!!!
Regards!

BI Master

Dear Friends,
After a long period, here I am again. :-)
This post is dedicated to my BI master, which I will start next Friday and which lasts 2 years.

Check the content of the master:
http://dcti.iscte.pt/mestrados/msiad/base/frames.htm

I'm very motivated to take this master; it will give me a very good background in BI technology. I will try to post some relevant topics here in order to get comments from you in the MSDN Forums or right here on the blog.
Kind regards

Monday, July 23, 2007

Blog Purpose

Dear Friends,
Yesterday an anonymous user posted on this blog and advised me to take some care with the information I publish…
He thought I was publishing confidential information in the examples… He was wrong, but since I had not warned anyone beforehand, I agree with him; still, he could have identified himself… especially because I feel he works near me…
Some people do not understand the purpose of this blog and don't know how important it is to me. However small this blog may be, if it helps at least one person, its existence is justified.
And to prove it, I'm in the top ten answerers in the SSIS, SSAS, SSRS and SharePoint MSDN Forums! Side by side with the biggest “gurus” on these subjects and the authors of my preferred BI books!!



I don't help others to be at the top of that list, but because they help me just as much to find the right solutions for my doubts… this is a community, and some people do not know what that means!
I'm proud to be part of this community and to have some knowledge to share…

In my free time I will keep sharing my ideas here and in the MSDN Forums!

Kind regards and good work

Thursday, July 19, 2007

SSIS - MSDN Gemma doubt

Dear friends,

I'm writing this post to try to help Gemma (from the MSDN forums) get the solution to her problem.






http://forums.microsoft.com/MSDN/ShowPost.aspx?PostID=1871831&SiteID=1
regards!

Thursday, July 12, 2007

SSAS - Slowly Changing Values

Dear Friends,
This post could be very controversial… I hope to receive comments from you…

The Problem
http://forums.microsoft.com/MSDN/ShowPost.aspx?PostID=1820942&SiteID=1

The Solution
Using the Slowly Changing Dimension transform, I think the performance would be worse (but maybe I'm wrong), and it is more useful when you need the data in Integration Services. So, inside one of the best books I have at home, MDX Solutions (http://eu.wiley.com/WileyCDA/WileyTitle/productCd-0471748080.html), I found the solution to my problem on page 84 (Chapter 3).




The data inside my fact table that holds the values used to create the historical data in MDX:


I hope to get comments on it!
Thanks and regards!

SSAS - Get ReferenceDate

Dear Friends,
In this post I will describe one of the latest problems I was involved with.

The Problem
I have millions of rows in a fact table with the fields ID, Date and Amount… (I have more fields in this table, but they are irrelevant for this post).
In Analysis Services these rows will be aggregated by Time, so for each day I have an amount. I need a calculated member that, for each day, returns the amount of a reference date. The formula is:

Reference Date = Last Date of previous year of [My Field Date] of each row in FactTable.

The Solution
I posted my problem in the MSDN forums and got an answer from Bryan C. Smith, but it didn't resolve my problem.
He gave me an example based on Adventure Works, but it did not work in my project.


So I went looking through several MDX functions, and after a lot of failed attempts I found the solution!


In this statement I'm using the ClosingPeriod and Ancestor MDX functions: Ancestor to get to the previous year, and ClosingPeriod to get the last value of that year at the day level of the hierarchy.
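Since the screenshot may not be readable, a sketch of a calculated member along these lines gives the idea; the hierarchy, level and measure names are just placeholders, and the exact statement in my cube differs.

// Sketch only: placeholder hierarchy, level and measure names.
CREATE MEMBER CURRENTCUBE.[Measures].[Amount At Reference Date] AS
(
    [Measures].[Amount],
    // last day of the previous year of the current Time member
    ClosingPeriod(
        [Time].[Calendar].[Day],
        Ancestor([Time].[Calendar].CurrentMember, [Time].[Calendar].[Year]).PrevMember
    )
);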


I hope this post helps you!
Regards!

Sunday, July 1, 2007

SSIS - DataFlow Performance I

Dear Friends,
Today I will show you images of the data flow inside the sequence container (SEQ IMPORT TO FactTable CashFlows).
This data flow imports data from SQL Server 2000 and inserts it into SQL Server 2005. Each data source comes from a different table. If your SSIS server does not have enough RAM, it might be better to split this data flow into two or more.




Give me your feedback!
Regards.

Saturday, June 30, 2007

BI CaseStudy - Packages Structures v1

Dear Friends,
Here I show you version 1.0 of the SSIS packages in my current project. You can see from the images the 3 packages that I will use. Inside each sequence container (SEQ) there are some complex tasks, but for now it is too early to describe them all for you.

1. Package


2. Package


3. Package


Regards,

Thursday, June 28, 2007

BI CaseStudy - Current and proposed system

Dear Friends,
In the previous posts I was talking about the project, but these images can give you a better understanding of it...
I'm developing this project alone, from the requirements phase to the final step in SharePoint and ProClarity. So I don't have a lot of time, but I'll keep trying to share some information here. I made these documents in Visio to show to the customer:

Current System


Proposed System


Regards!

Friday, June 15, 2007

SSAS - Problem&Solution - Get Difference between current and previous row for valid dates

Dear Friends,
I posted a question in the Microsoft forums, and I describe here the solution I found.

Problem:
http://forums.microsoft.com/MSDN/ShowPost.aspx?PostID=1720487&SiteID=1

Solution:

I created a view in the database filtering only the valid dates in the Time table.
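The view definition itself is only visible in the screenshot, but the idea is a simple filtered view over the Time table; a sketch, assuming a placeholder flag column that marks the valid dates, would be:

CREATE VIEW dbo.vwTimeValidDates
AS
SELECT  t.TIME_ID,
        t.[Date]
FROM    dbo.[Time] AS t
WHERE   t.IsValidDate = 1;   -- placeholder flag marking the dates to keep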

The datasource view in Analysis Services:

Exploring Data in TIME dimension:

Exploring DATA in CASHFLOW Dimension:

And the named calculation that computes the difference between the current and previous values:

And in the browser of the cube:

Please let me know if you think there is a better solution!
Regards!
