Showing posts with label cube.

Friday, March 23, 2012

Problem processing new cube

All,
When I try to "Design Storage" for a new cube, I get
a "Fact table size can not be 0" message. (I am in the
Cube Editor...) If I focus on the fact table and
select "Browse Data", the data is displayed as it should
be. So, the fact table size clearly is not "0".
From Query Analyzer, I can select rows. The table has over
one million rows in it.
I can't see anything obvious, such as "Refresh connection"
that would seemingly rectify this problem.
Any help appreciated.
TIA,
Mike

In the cube editor, set a value for the Fact Table Size property on the Advanced tab of the properties panel. This value, along with the sizes of the dimensions, is used during the aggregation design process.
- Matt Carroll
--
This posting is provided "AS IS" with no warranties, and confers no rights.
"Mike" <anonymous@.discussions.microsoft.com> wrote in message
news:013901c3c58c$86ab1c40$a301280a@.phx.gbl...
quote:

> All,
> When I try to "Design Storage" for a new cube, I get
> a "Fact table size can not be 0" message. (I am in the
> Cube Editor...) If I focus on the fact table and
> select "Browse Data", the data is displayed as it should
> be. So, the fact table size clearly is not "0".
> From Query Analyzer, I can select rows. The table has over
> one million rows in it.
> I can't see anything obvious, such as "Refresh connection"
> that would seemingly rectify this problem.
> Any help appreciated.
> TIA,
> Mike
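To set the property, you need the actual row count. A quick way to get it from Query Analyzer (the table name below is a placeholder for your fact table):

```sql
-- Row count to enter as the Fact Table Size property (table name is hypothetical)
SELECT COUNT(*) AS FactTableSize
FROM dbo.MyFactTable
```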

Problem processing cubes in example Adventure Works DW

Hi,

I've installed the sample databases, and I am trying to process the cubes in the sample.

I keep getting errors like this, for cube "Adventure Works":

Errors in the OLAP storage engine: The attribute key cannot be found: Table: dbo_DimCustomer, Column: AddressLine1, Value: 8011 Mcnutt Ave. Errors in the OLAP storage engine: The record was skipped because the attribute key was not found. Attribute: Customer of Dimension: Customer from Database: Adventure Works DW, Record: 1024.

I have set processing to ignore errors, but the cube is never successfully processed.

What can I do to get the cube processed?

Hi John,

Could you please check that you have installed the right AdventureWorksDW relational database, and that your AdventureWorksDW AS database's data source points to it?
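One way to sanity-check the relational database is to look for fact rows whose keys have no matching dimension member. A sketch, assuming the standard AdventureWorksDW `FactInternetSales`/`DimCustomer` schema:

```sql
-- Fact rows whose CustomerKey has no matching row in the customer dimension;
-- any hits would explain the "attribute key cannot be found" error
SELECT f.CustomerKey
FROM dbo.FactInternetSales AS f
LEFT JOIN dbo.DimCustomer AS d
    ON f.CustomerKey = d.CustomerKey
WHERE d.CustomerKey IS NULL
```

If this returns no rows, the relational data is intact and the problem lies elsewhere (e.g. the data source connection or the installed sample version).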

YL

|||

Yan,

The "Adventure Works DW" AS DB datasource points to the "AdventureWorksDW" database installed as sample (on the same machine), going by the connection string.

How can I check that I have the "right" AdventureWorksDW?

John

|||

John,

If you installed the sample Adventure Works DW successfully, and verified that the relational DB is also installed and no records in it were removed, the processing should work.

Yan

|||

I've fixed the problem. I had SQL Server 2000 installed as my default instance and 2005 as a named instance, and it appears that was the source of the problem. I had to uninstall both and then reinstall only 2005, but the sample is now working.


Problem processing cube partition

I am getting the following error when trying to process a partition, as if
the process is trying to overwrite the database. How can I fix this?

TIA

Error 1 Errors in the metadata manager. The database with the name of
'ACRPROD_OLAP' already exists in the 'MBPENTSQL01R\ACR' server. 0 0

Are you deploying from BIDS? Did you rename a database that was previously deployed? The following post might shed some light. You might check the Name and ID properties of each deployed database and see whether any have the ID of ACRPROD_OLAP regardless of what Name they have.

http://geekswithblogs.net/darrengosbell/archive/2007/06/06/BIDSHelper-DeployMDXScript-feature-enhanced.aspx

Easy fix... delete the conflicting database off the server. More complex fix... instead of renaming a database on the server, backup the database, delete it, then restore it as the new name... that will make sure the new name matches the new ID.


Problem processing cube

I am getting the error:

Warning 2 Errors in the OLAP storage engine: The record was skipped because the attribute key was not found. Attribute: B@. - key@. of Dimension: Brcledger from Database: Misys, Cube: Misys 1, Measure Group: Brcledger, Partition: Brcledger, Record: 24163. 0 0

I have read that this is caused by a referential integrity problem; however, the fact table is the dimension table, and the joint primary key is definitely unique. There doesn't seem to be anything special about record number 24163. So why can I not process my cube?

Martin

Just in case it is any help, I have pasted the other error message that comes up here:

Warning 1 Errors in the OLAP storage engine: The attribute key cannot be found: Table: dbo_brcledger, Column: B_x0040_, Value: 0; Table: dbo_brcledger, Column: key_x0040_, Value: 474142463032485130310000000C. 0 0

SELECT *
FROM brcledger
WHERE (key@. = '474142463032485130310000000C')

Returns 1 result
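Note that the error reports both columns of the joint key (with B@ = 0), so it may be worth querying the pair together and verifying the joint key really is unique. Bracketed names are assumed here for the columns containing "@":

```sql
-- Look for the exact key pair reported in the error
SELECT [B@], [key@]
FROM dbo.brcledger
WHERE [key@] = '474142463032485130310000000C'

-- And check whether any joint key value appears more than once
SELECT [B@], [key@], COUNT(*) AS cnt
FROM dbo.brcledger
GROUP BY [B@], [key@]
HAVING COUNT(*) > 1
```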

|||

You have a situation where Analysis Server cannot find a key in the dimension while processing the partition.

I would guess you have problems with dimension processing that lead to the partition processing errors.

Try re-processing your Brcledger dimension, changing the processing options to report and stop on every error.

After you fix the dimension problems you should be able to process your partitions without a problem.

Edward.
--
This posting is provided "AS IS" with no warranties, and confers no rights.

|||

I have the same problem, but it only occurs when I define my dimension as a time dimension.

Also, my fact table is a view which gets its surrogate keys by joining to the dimension tables in its definition. This might be connected, but the cube processes fine with the same data source, fact table, dimensions, etc. as long as I don't define the dimension as a time dimension.

Also, the value of the attribute key it can't find is always zero ('0'). There are the same number of errors in the processing as there are NULLs in the key column of the fact table, so I assume the NULLs are being converted to 0 and then the referential integrity breaks.
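The NULL-to-zero theory is easy to confirm by counting NULL keys in the fact view and comparing the count to the number of processing errors. The view and column names below are hypothetical:

```sql
-- If this count matches the number of processing errors,
-- NULL surrogate keys being coerced to 0 are the likely culprit
SELECT COUNT(*) AS null_keys
FROM dbo.MyFactView
WHERE TimeKey IS NULL
```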

Please help,

thanks,

Wednesday, March 21, 2012

Problem opening local cube with ProClarity 6.2

I am new to BI and have a perplexing problem. I need to create a local cube for offline use in ProClarity 6.2 for mobile users.

I have created a sample cube in AS2005 using the following code:

CREATE GLOBAL CUBE [RainbowBranch]
STORAGE 'C:\SourceData\RainbowBranch.cub'
FROM [Rainbow Trade Sales]
(
    MEASURE [Rainbow Trade Sales].[Adj Sales Value],
    DIMENSION [Rainbow Trade Sales].[Branch]
)

The resultant cube can be opened in Excel 2003 with no problems, but when opened in ProClarity, there is no cube name to select, so I can never get to see the cube data.

Is there a problem with my cube build statement? Does it need more parameters?

Regards,

Charles.

Hello. The coming 6.3 version of ProClarity, supposed to be released any day now, will have strong improvements for local cubes. You may be running into a bug in 6.2.

HTH

Thomas Ivarsson

|||

Thanks Thomas,

I heard about the forthcoming ProClarity 6.3 release. I believe that it will only be available from our vendor at the end of March, which is a problem in terms of making decisions at this point in time. Anyhow, back to my current problem; A colleague sent me a cub file to test, the cub is named "Debtors Analysis" - I think it is part of sample data but I do not know how it was generated (AS2000 perhaps or ProClarity 5.3 ?). This "Debtors Analysis" cub opens perfectly in both ProClarity 6.1 & 6.2 as well as in Excel 2003. This leads me to believe that my problem lies with my cub file - either something missing in my cub file which causes ProClarity to fail, or something extra in the cub file which ProClarity does not recognise.

Are you aware of any other way to create cub files that I can test in the meantime. Preferably not using another 3rd party, commercial product. Thanks.

Regards,

Charles.

|||

Have you tried looking on the ProClarity community bulletin boards? There are many similar posts.

|||

I have the same problem opening a local cube.


Tuesday, March 20, 2012

Problem on Cube processing

Hi, all,

Thanks for your kind attention again.

I encountered a strange problem: whenever I process the cube (which is a small cube, as I have already partitioned it on the date dimension and the whole cube holds fewer than 1,000 rows), the processing runs out of space on the system drive and of course fails. I have actually manually allocated storage locations that are not on the local system drive. Why did this problem happen, and how can I fix it?

This is a really frustrating problem; please give me any advice and help if you have any ideas about what is going on.

I am looking forward to hearing from you and thanks a lot in advance again.

With best regards,

Yours sincerely,

Which AS do you have, AS2K or AS2K5?

Please pay attention to the queries that are sent to the source SQL Server. I think you have a Cartesian product somewhere in your cube design that leads to the data explosion.

|||

Hi,

Thanks for your kind attention.

I am running AS2005. Yes, I have found the cause of the problem: the fact table the measure group is built from is too big, with over a million records. Even though my cube has been partitioned down to fewer than a thousand records, the processing still needs to query the whole fact table, which resulted in running out of system drive space (I am only doing some tests on a virtual machine at the moment, which has very little system drive storage).

I have now partitioned the fact table so that only a small proportion of the data feeds the cube, and the processing is OK.
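Restricting a partition to a slice of the fact table is typically done with a WHERE clause in the partition's source query rather than by physically splitting the table. A sketch, with hypothetical table and column names:

```sql
-- Partition source query bound to a single month's slice of the fact table
-- (table and column names are placeholders)
SELECT *
FROM dbo.FactSales
WHERE DateKey BETWEEN 20070101 AND 20070131
```

This keeps the rows scanned during processing down to the partition's slice instead of the full fact table.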

Thanks again.

With best regards,

Yours sincerely,

problem olap process

Hi,
I have a cube in Analysis Manager. This cube is processed via a DTS package in SQL Server 2000 with an incremental update.
The SQL task finishes successfully, but there is no new data in the cube.
So I process the cube manually with the "Process a cube" dialog box and select Incremental Update: the step finishes successfully, but there is still no new data!
I have to process the cube with a full process each time!
Why doesn't that work? Any ideas?

There is no quick way to resolve this, but you may try reviewing this link to troubleshoot the issue: http://www.databasejournal.com/features/mssql/article.php/1582491

HTH