Error: Code: 0x00000000
Source: SRC Process dimensions Error:
Description: File system error: A FileStore error from WriteFile occurred. Physical file: \\?\<pathto filename> attribute.asstore. Logical file: . .
Errors in the OLAP storage engine: An error occurred while the '<attribute id>' attribute of the <dimensionname> dimension from the '<database>' database was being processed.
Errors in the OLAP storage engine: The process operation ended because the number of errors encountered during processing reached the defined limit of allowable errors for the operation.
End Error
I encountered this while incrementally processing a dimension via an AMO script in SSIS. Does anyone have ideas on what this error means and how to avoid it?
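For reference, an incremental dimension process like the one described can also be issued directly as an XMLA command; a minimal sketch, where `MyDatabase` and `MyDimension` are placeholder object IDs, not names from the thread:

```xml
<!-- ProcessUpdate re-reads the dimension table and applies member
     inserts/updates/deletes without fully reprocessing the dimension. -->
<Process xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
  <Type>ProcessUpdate</Type>
  <Object>
    <DatabaseID>MyDatabase</DatabaseID>
    <DimensionID>MyDimension</DimensionID>
  </Object>
</Process>
```

The same operation in AMO is `Dimension.Process(ProcessType.ProcessUpdate)`.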
I unprocessed and reprocessed to resolve this.|||What do you mean by "Unprocess a cube"?
How did you unprocess a cube?
Thanks.
Chris
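For context, "unprocessing" an Analysis Services object generally means clearing its processed data with a ProcessClear, after which the object (and anything depending on it) must be fully reprocessed. A minimal XMLA sketch, with placeholder object IDs:

```xml
<!-- ProcessClear drops all processed data for the object,
     returning it to an unprocessed state. -->
<Process xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
  <Type>ProcessClear</Type>
  <Object>
    <DatabaseID>MyDatabase</DatabaseID>
    <DimensionID>MyDimension</DimensionID>
  </Object>
</Process>
```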
|||Description: File system error: A FileStore error from WriteFile occurred. Physical file: \\?\<pathto filename> attribute.asstore. Logical file: . .
Yikes. This has happened to me a few times with cubes and dimensions. I also get the WriteFile error, which appears to be a bad reference to a file that no longer exists. I tried other processing methods to process the broken dimension, and now I don't even get the error anymore. Instead, the processing dialog box just hangs, saying it is building a processing schedule. I do not want to unprocess the dimension, because then I would have to reprocess the entire database. Does anyone have a clue as to how to avoid this, or why it happens? How could one fix it without unprocessing?
Thanks,
Sally
|||That was a while ago. My problem was actually caused by the size of the dimension I was processing.
It was just too big. Later I reduced its size, and the error disappeared.
Thanks anyway.
Chris
|||I have been experimenting with large dimensions (or at least what we consider to be large - ~100mm members), and have learned that the string store (filename ends in .asstore) cannot exceed 4 gig. Is this the same thing you have run into, or have you found another limitation?
John
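As a side note on the 4 GB ceiling: in SQL Server 2012 and later, Analysis Services exposes a StringStoresCompatibilityLevel property that, when set to 1100, switches the object to a larger string-store format; earlier versions are hard-capped at 4 GB. A sketch of the DDL fragment, assuming the property is set on the dimension definition (it also applies to measure-group partitions):

```xml
<!-- Assumption: SSAS 2012+ and a database at a compatible level;
     on earlier versions this property is not available. -->
<StringStoresCompatibilityLevel>1100</StringStoresCompatibilityLevel>
```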