Recent Posts

Xwrite and Xref functions in Essbase

How to use XWRITE and XREF in Essbase

The main intention of this post is to describe how @XREF and @XWRITE can be used for similar purposes.
Sometimes you need to write or copy data between cubes, whether they belong to the same application or to an application on a remote server.
@XWRITE and @XREF are two calculation functions that can be used for such operations.

Using @XREF:
The @XREF function pulls data into the TARGET cube, so your business rule / calc script runs against the TARGET cube and the FIX statement is based on the TARGET cube dimensions.
Set your FIX statement to include all of the common dimensions, and select the members you want to use for any dimensions that are unique to the TARGET cube or whose members differ between cubes. In this example the Composite dimension is unique to the TARGET cube, and the Version dimension is common between the two cubes, but we are moving the data from Draft2 in the SOURCE cube to Draft1 in the TARGET cube.
@XREF Syntax:
FIX (Common Dimensions, Unique TARGET Dimensions, Common Dimensions but Different Members selected for TARGET cube)
Open TARGET Account block (
@XREF("SOURCE Cube Alias", SOURCE Account member, "Members from unique SOURCE cube Dimensions");
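As a minimal sketch of this pattern (the location alias _SrcAlias_ and all member names below are hypothetical, not from the original post), a rule run against the TARGET cube might look like:

```
/* Runs in the TARGET cube; pulls Draft2 data from the source into Draft1 */
FIX ("Jan", "FY12", "Composite_A", "Draft1")
   "Revenue" = @XREF(_SrcAlias_, "Revenue", "Draft2");
ENDFIX
```

Note that _SrcAlias_ must already exist as a location alias on the target database pointing at the source cube.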

Using @XWRITE:
The @XWRITE function pushes data to the TARGET cube, so your business rule / calc script runs against the SOURCE cube and the FIX statement is based on the SOURCE cube dimensions.
Set your FIX statement to include all of the common dimensions, and select the members you want to use for any dimensions that are unique to the SOURCE cube or whose members differ between cubes. In this example the MarketSector dimension is unique to the SOURCE cube, and the Version dimension is common between the two cubes, but we are moving the data from Draft2 in the SOURCE cube to Draft1 in the TARGET cube.
@XWRITE Syntax
FIX (Common Dimensions, Unique SOURCE Dimensions, Common Dimensions but Different Members selected for SOURCE cube)
Open SOURCE Account block(
@XWRITE(SOURCE Account member, TARGET Cube alias, TARGET Account member, Unique TARGET Dimension members);
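A corresponding sketch from the SOURCE side (again with a hypothetical location alias _TgtAlias_ and hypothetical member names):

```
/* Runs in the SOURCE cube; pushes Draft2 data into Draft1 of the target */
FIX ("Jan", "FY12", "MarketSector_X", "Draft2")
   "Revenue" (
      @XWRITE("Revenue", _TgtAlias_, "Revenue", "Draft1");
   )
ENDFIX
```

Here _TgtAlias_ is a location alias on the source database pointing at the target cube.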

For example, suppose you have two databases (cubes) called A1 and B1, and they have different outline structures, as shown below.
Cube A1 outline: (outline screenshot not reproduced here)
Cube B1 outline: (outline screenshot not reproduced here)
@XREF: Suppose the East_Sales->Jan->Dept_101->2011 intersection of the B1 cube has to get data from the Sales->Jan->East->Budget->2011 intersection in the A1 cube. In other words, we are copying data from the A1 cube to populate data in the B1 cube.
A calculation script written against the B1 cube can be used in such cases.
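The original script is not reproduced here; a hedged reconstruction from the description above (member and alias names taken from the bullets below) might look like:

```
/* Runs in the B1 cube; _A1alias_ is the location alias pointing to A1 */
FIX ("Jan", "2011", "Dept_101")
   "East_Sales" = @XREF(_A1alias_, "Sales", "East", "Budget");
ENDFIX
```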

  • _A1alias_ is the location alias name of the A1 cube, which acts as the source from which @XREF gets the data. B1 is called the target, to which we are copying the data.
  • @XREF always refers to the data cell formed by combining the member names given in the FIX statement with the members given in the @XREF call. Here Sales and Budget are members of the A1 cube, even though the calculation itself lives in the B1 cube.
  • Whenever we want to get the value for some intersection by referring to another cube, we have to run this calculation. Each time it runs, it goes into the A1 cube and searches for the intersection to get the data, and fetching data from another cube takes time.
  • If you want to copy more than one member from the same dimension, you can write multiple @XREF statements.
  • We can also use this function in member formulas.
@XWRITE: This works the same way, but source and target are interchanged: we write the calculation script in the source cube, which is A1 in our case. So whenever the data is ready, we can run the corresponding calculation to write the data from the A1 cube to B1.
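The original @XWRITE script is also not reproduced; a hedged reconstruction written against A1 (_B1alias_ is a hypothetical location alias pointing to B1) might be:

```
/* Runs in the A1 cube; writes Sales->East->Budget into B1's East_Sales->Dept_101 */
FIX ("Jan", "2011", "East", "Budget")
   "Sales" (
      @XWRITE("Sales", _B1alias_, "East_Sales", "Dept_101");
   )
ENDFIX
```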

  • If you need to perform this kind of operation, I would suggest using @XWRITE over @XREF, because @XWRITE writes the data to the target cube whenever the source cube has data, and it is faster. @XREF, on the other hand, has to fetch data from the other cube on every request, so it takes time, depends on the source cube's availability, and we don't even know whether data exists for the intersection we are seeking. If it doesn't exist, running the calculation makes no sense.
  • When the cube is accessed heavily and many calculations run frequently, yet you still have to populate some cells by referring to another cube, using @XREF to get those values will increase your calculation time. In such situations you can go with @XWRITE.
  • Using @XWRITE we can also write data to the same cube itself by using the @LOOPBACK function, as in the following statement: @XWRITE("Sales", @LOOPBACK, "Expenses");
  • We can also use @XWRITE in member formula calculations.
  • @XREF can be used for other purposes as specified in the technical reference; it has its own advantages.

Hyperion Planning and Essbase interview questions

  • 1. How many types of optimization have you done? Can you explain them?
  • 2. When have you used UDAs and attribute dimensions? Can you describe the scenarios where you used them?
  • 3. I have write access to one dimension and read access to the other dimensions. Is it possible to load data into the dimensions?
  • 4. What is the difference between the shift and relative functions, and where have you used them?
  • 5. When will a block be created? How do you calculate block size? What is your cube size?
  • 6. What is a partition? What issues did you face while creating a partition?
  • 7. What is a relative function?
  • 8. Can you explain your data backup process? What commands do you use?
  • 9. How do you refresh your data?

Hyperion Essbase optimization

Essbase system and application performance tuning are very important tasks for a successful implementation of Essbase or Hyperion Planning. Good application design and proper system administration include some of the following items.
1) Design the Outline Using the Hourglass Model
2) Defragmentation
3) Database Restructuring
4) Compression Techniques
5) Cache Settings
6) Intelligent Calculation
7) Data Load Optimization
8) Uncommitted Access
Design the Outline Using the Hourglass Model:
Build the outline so that dimensions are placed in the following order:
1) Dimension tagged as the Accounts dimension type
2) Dimension tagged as the Time dimension type
3) Largest dense
4) Smallest dense
5) Smallest sparse
6) Largest sparse
7) Attribute dimensions
Using the hourglass model improves the calculation performance of the cube.
Defragmentation:
Fragmentation is caused by any of the following activities:
1) Frequent data loads
2) Frequent data retrieval
3) Calculations
We can check whether the cube is fragmented by looking at its average clustering ratio in the database properties. The optimum clustering value is 1. If the average clustering ratio is less than 1, the cube is fragmented, which degrades its performance.
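As a sketch, the same statistic can also be queried from MAXL (Sample.Basic here is a placeholder for your own application and database):

```
query database Sample.Basic get dbstats data_block;
```

The output includes the average clustering ratio along with the other block statistics.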
There are three ways of doing defragmentation:
1) Export the application's data to a text file, clear the data, and reload it from the text file without using a rules file.
2) Use a MAXL command to force a restructure:
alter database appname.dbname force restructure;
3) Add and delete one dummy member in a dense dimension to force a full restructure.
Database Restructuring:
There are 3 types of Restructure.
1)Outline Restructure
2)Sparse Restructure
3)Dense Restructure/Full Restructure
Outline Restructure:
When we rename a member, add an alias to a member, or change a member formula, an outline restructure happens.
Dense Restructure:
When we move, delete, or add a member in a dense dimension, a dense (full) restructure happens.
Sparse Restructure:
When we move, delete, or add a member in a sparse dimension, a sparse restructure happens.
Compression Techniques:
There are four types of compression:
1) Bitmap compression
2) RLE (Run-Length Encoding)
3) ZLIB compression
4) No compression
Cache Settings:
Index Cache: Min: 1024 KB (1048576 bytes). Default: buffered I/O: 1024 KB (1048576 bytes); direct I/O: 10240 KB (10485760 bytes). Optimum: the combined size of all essn.ind files, if possible; otherwise as large as possible. Do not set this cache size higher than the total index size, as no performance improvement results.
Data File Cache: Min: direct I/O: 10240 KB (10485760 bytes). Default: direct I/O: 32768 KB (33554432 bytes). Optimum: the combined size of all essn.pag files, if possible; otherwise as large as possible. This cache setting is not used if Essbase is set to use buffered I/O.
Data Cache: Min: 3072 KB (3145728 bytes). Default: 3072 KB (3145728 bytes). Optimum: 0.125 * the value of the data file cache size.
Calculator Cache: Min: 4 bytes. Max: 200,000,000 bytes. Default: 200,000 bytes. Optimum: the best size depends on the number and density of the sparse dimensions in your outline, and on the amount of memory the system has available. The command SET CACHE HIGH | LOW | OFF; is used in calc scripts to select the cache level.
We set the values for the calculator cache in the essbase.cfg file, and the server must be restarted for the changes to take effect.
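A sketch of the corresponding essbase.cfg entries (the settings CALCCACHEHIGH, CALCCACHEDEFAULT, and CALCCACHELOW define the three levels; the byte values below are illustrative only, not recommendations):

```
CALCCACHEHIGH    200000000
CALCCACHEDEFAULT 200000
CALCCACHELOW     20000
```

A calc script then selects one of these levels with, for example, SET CACHE HIGH; before its calculations.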
Intelligent Calculation:
Whenever a block is created for the first time, Essbase treats it as a dirty block. When we run CALC ALL / CALC DIM, Essbase calculates all blocks and marks them clean. Subsequently, when we change a value in any block, that block is marked dirty again. When we run calc scripts after that, only the dirty blocks are calculated; this is known as Intelligent Calculation.
By default, Intelligent Calculation is ON. To turn it off, use the command:
SET UPDATECALC OFF;
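As a minimal sketch, a calc script that disables Intelligent Calculation in order to force a full recalculation might look like:

```
/* Force all blocks to be recalculated, ignoring clean/dirty status */
SET UPDATECALC OFF;
CALC ALL;
```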
Data Load Optimization:
Data load optimization can be achieved by the following:
1) Always load the data from the server rather than the file system.
2) The data values should come last, after the member combinations.
3) Use #MI instead of zeros; a '0' consumes 8 bytes of memory for each cell.
4) Restrict decimals to a maximum of 3 places (e.g., 1.234).
5) Data should be loaded in inverted-hourglass order (largest sparse to smallest sparse, followed by smallest dense to largest dense). Sorting data this way before loading allows a block to be fully populated before it is stored on disk, eliminating the need to retrieve a previously created block from disk during the load.
6) Always pre-aggregate data before loading it into the database.
DL threads write (4/8): used for parallel data loads, with 4 parallel threads on a 32-bit system and 8 on a 64-bit system.
By default, Essbase loads data record by record, which consumes a huge amount of time during data loads.
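If I recall the essbase.cfg syntax correctly (verify against the technical reference; Sample and Basic are placeholder application and database names, and the thread counts are illustrative), the parallel-load thread settings look like this:

```
DLTHREADSPREPARE Sample Basic 4
DLTHREADSWRITE   Sample Basic 8
```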
Uncommitted Access:
Under uncommitted access, Essbase locks blocks for write access only until it finishes updating the block, whereas under committed access it holds locks until the transaction completes. With uncommitted access, blocks are therefore released more frequently, and Essbase performance is better. Besides, parallel calculation only works with uncommitted access.
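As a sketch, the isolation level can be switched from MAXL (Sample.Basic is a placeholder; I believe the relevant setting is committed_mode, but verify against the MAXL reference):

```
alter database Sample.Basic disable committed_mode;
```

Disabling committed mode puts the database in uncommitted access.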

Exporting and Importing Hyperion Planning Data Forms


Hyperion Planning provides FormDefUtil.cmd to export and import data forms.
The default path for version 11.1.1 is C:\Hyperion\products\Planning\bin.
Only an administrator has rights to do the export and import.
Step 1: Open the command prompt.
Step 2: The default path is C:\Hyperion\products\Planning\bin.
Change to that path (cd C:\Hyperion\products\Planning\bin) and press Enter.
Step 3: Export the data forms from the application (this will export all the data forms).
The correct syntax for exporting the data forms is:

The command is case sensitive; make sure that the F, D, and U in FormDefUtil are capital letters.
FormDefUtil.cmd export -all king admin Sample
Here king is my server name, admin is my user name, and Sample is the application name:
FormDefUtil.cmd export -all servername username applicationname
Step 4: It will ask for the password.
Enter your password.
Step 5: If it is successful, it will display the message below:
displayName = Planning
componentTypes =
priority = 50
version =
build = 1
location = C:\Hyperion\products\Planning
taskSequence =
task =
using Java property for Hyperion Home C:\Hyperion
Setting Arbor path to: C:\Hyperion\common\EssbaseRTC\
d{ISO8601} INFO main com.hyperion.audit.client.runtime.AuditRuntime – Entering Audit Client
d{ISO8601} INFO main com.hyperion.audit.client.runtime.AuditRuntime – Initializing Manager for the server
d{ISO8601} INFO main com.hyperion.audit.client.manager.AuditContext – Creating New Audit Client Instance ….
d{ISO8601} INFO main com.hyperion.audit.client.runtime.AuditRuntime – Audit Client has been created for the server
Form exported complete.
Form exported complete.
Form exported complete.
No composite form found.
d{ISO8601} INFO Thread-14 com.hyperion.audit.client.runtime.AuditRuntime – Writing Audit Records for migrated Artifacts …
d{ISO8601} INFO Thread-14 com.hyperion.audit.client.runtime.AuditRuntime – Stored Audited Data Successfully !!!
Step 6: Check the exported data forms (XML files) in the following location:
Step 7: To import into an application, the command is the same, but with import instead of export:
FormDefUtil.cmd import -all localhost admin Sample
Step 8: Finally, check the application.

Please find some more information on export and import data forms

If you want to export all the forms, you can type:
FormDefUtil export -all server username application (e.g., FormDefUtil export -all localhost admin sampleapp)
If you want to export only one form, you can type:
FormDefUtil export FormName server username application (e.g., FormDefUtil export retirement localhost admin sampleapp)
The above export command creates an XML file that you can import in a different environment.
To import forms into the target Planning app, you can type:
FormDefUtil import -all server username application (e.g., FormDefUtil import -all localhost admin sampleapp)


All forms within an application:
FormDefUtil.cmd export -all localhost khoonks (username here) normal (application name here)

It will ask for the password.

One form that you want to export:
For example, if you want to export a form named "Other Expenses" from the "normal" application, the syntax would be 

FormDefUtil.cmd export "Other Expenses" localhost khoonks normal

It will ask for password.

After the script executes, the output file can be found in the Planning1 folder, e.g. D:\oracle\Middleware\user_projects\epmsystem1\Planning\planning1.
The output is an XML file with a name like "Other Expenses - Capital.xml".

To import only one form, you can type:
FormDefUtil import filename-with-location server username application (e.g., FormDefUtil import formname.xml localhost admin sampleapp)
After you run the command, if it asks for the password, type it in and wait for the command to finish.



Hyperion planning offline data load


1. Install the offline component of Smart View (take help from your IT team)

The executable offline.exe can be found in HYPERION_HOME\Planning\bin\SmartView (this path can change based on your installation path). Double-click the executable to install the offline component.
After the install you will have an offline folder with the expected directory structure (screenshot not reproduced here).


2. Provision the user for offline access (log in to Shared Services and provision the user)

In Shared Services, check that the user has been provisioned with the 'Offline User' role for the Planning application to be accessed.

3. Enable the Planning Application for offline access

To enable an application for Offline Planning, the administrator has to ensure that the Planning application property "ENABLE_FOR_OFFLINE" is set to true. This property is located within "Administration" -> "Manage Properties" -> "Application Properties"

4. Enable Forms for Offline access

In Planning, connect to the Planning Application. Select Administration, Manage Data Forms, Edit the forms to be accessed via Excel; on the Properties tab, make sure that 'Enable Offline Usage' is selected.

5. Go to Smart View, connect to your Planning application, and verify your provisioning.

6. Now, in Excel, the web form name will show up in the Take Offline list.

MAXL scripts for Data Load and Data Backup

Hi Guys,

I am providing MAXL scripts for:

1. Data backup

2. Loading data using a rule file

3. Data export

4. Loading data from a backup file


Export Import Maxl

login Username Password on Hostname;

set message level error;

spool on to 'D:\Sample\log\SampleExportImport.log';

/* Let users know maintenance is in progress */
alter database Sample.Sample set note 'Maintenance is in progress - you will not be able to update';
iferror 'exitbeforecomplete';

/* Start the application and database */
alter system load application Sample;
alter application Sample load database Sample;

/* Prevent any user with a permission lower than Application designer from making connections or running commands */
alter application Sample disable connects;
iferror 'exitbeforecomplete';

/* kick out all sessions */
alter system logout session on database Sample.Sample;
iferror 'exitbeforecomplete';

/* Save level 0 data out to file */
export database Sample.Sample level0 data in columns to data_file 'D:\Sample\CurrentDay\SampleData.txt';
iferror 'exitbeforecomplete';

/* Clear all data out of database */
alter database Sample.Sample reset data;
iferror 'exitbeforecomplete';

/* Load Exported file back in */
import database Sample.Sample data
from local text data_file 'D:\Sample\CurrentDay\SampleData.txt'
on error write to 'D:\Sample\CurrentDay\SampleData.err';
iferror 'exitbeforecomplete';

/* validate data headers okay */
alter database Sample.Sample validate data to local logfile 'D:\Sample\CurrentDay\SampleValidate.log';
iferror 'exitbeforecomplete';
alter database Sample.Sample repair invalid_block_headers;
iferror 'exitbeforecomplete';

/* Save log file */
shell copy D:\Sample\Sample.LOG 'D:\Hyfi\Backups\Sample\CurrentDay\Sample.log';
iferror 'exitbeforecomplete';

/* Clear the application log file */
alter application Sample clear logfile;
iferror 'exitbeforecomplete';

/* Stop Application */
alter system unload application Sample;

/* Start Application */
alter system load application Sample;

/* Allow people to connect now */
alter application Sample enable connects;

define label 'exitbeforecomplete';

/* Process failure */
spool off;

Calc script Maxl

login Username Password on Hostname;

 set message level error;

 spool on to 'D:\Sample\log\SampleCalcs.log';

 execute calculation Sample.Sample.Sdcp;

 execute calculation Sample.Sample.Sdcp1;

 spool off;


Calling the Calc Script Maxl in Batch

essmsh D:\Sample\mxl\SampleCalcs.mxl
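As a sketch, the essmsh call can be wrapped in a Windows batch file that checks the return code before continuing (the path matches the example above but is a placeholder for your environment):

```
@echo off
rem Run the calc MAXL script and stop the batch on failure
essmsh D:\Sample\mxl\SampleCalcs.mxl
if %ERRORLEVEL% neq 0 (
    echo SampleCalcs.mxl failed with error %ERRORLEVEL%
    exit /b %ERRORLEVEL%
)
echo Calculations completed successfully.
```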