Friday, July 31, 2020

EPM Integration Agent: Configuration tips

The EPM Integration Agent was introduced last year, providing the ability to connect to on-premises data sources. Though the tool can connect to third-party REST APIs via Groovy or Jython scripting, I assume the majority of use cases will be connections to relational databases, usually the warehouses that have been feeding data to various reporting and planning applications in the on-premises world. Unless clients move to a cloud-based ERP, those data warehouses are going to stay for a while, and so is the need to pull data from them.

Oracle provides connectivity for Oracle and Microsoft SQL Server by default, but what about other data sources? Other than scripting, if a data source offers SQL capability, we can essentially pull data from it. I recently connected to a Netezza database via the Integration Agent and thought I would share some quick tips I learnt during the exercise.

  • Integration Agent Port

The agent requires a port to communicate, and by default it is 9090. Many organisations, especially those in financial services, have their ports blocked. Moreover, 9090 is a common default port for many applications and might already be in use. Use the telnet command to find out whether the port is already taken, as shown below; if it is, use a different port.
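
For example, a quick check from the agent machine (the host and port here are placeholders):

    telnet localhost 9090

If the connection is accepted, something is already listening on 9090 and the agent should be configured with another port; a refused connection means the port is free.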

  • Start the agent manually as a process before registering it as a service

The agent now offers the ability to register itself as a Windows service. When attempting this, the service can get installed even though the agent fails to register with the cloud application, so make sure you check the agent logs. I prefer to first start the agent manually as a process, to confirm that the INI file is populated correctly and the agent can communicate with the cloud application, before registering it as a service.
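
For reference, the manual start is a single command; this sketch assumes the agent was unzipped to C:\EPMAgent and the INI file sits in C:\EPMAgentData (both paths are examples):

    cd C:\EPMAgent\bin
    epmagent.bat C:\EPMAgentData\agentparams.ini

Watch the console and the agent log to confirm it registers with the cloud application successfully before converting it to a service.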

  • Check Proxy Settings

Many organisations have proxy servers set up for external communication. The agent is essentially a Java-based utility, and it will not be able to communicate through the proxy unless the proxy details are provided in the INI file. With EPM Automate, we can pass proxy details as part of the login command, but with the Integration Agent they must go in the INI file. Many of us tend not to read the documentation for each and every parameter, and that is exactly what I did. The INI file provides the CUSTOM_JAVA_OPTIONS parameter for supplying the proxy details the agent should use when communicating with the cloud, in the format -Dhttp.proxyHost=proxy.example.com -Dhttp.proxyPort=8080.
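
For example, in agentparams.ini (the host and port are placeholders; since the cloud endpoint is HTTPS, the standard Java https.proxyHost/https.proxyPort variants may be required as well):

    CUSTOM_JAVA_OPTIONS=-Dhttp.proxyHost=proxy.example.com -Dhttp.proxyPort=8080 -Dhttps.proxyHost=proxy.example.com -Dhttps.proxyPort=8080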

  • Installation paths for Groovy and Jython

When installing Jython or Groovy on the agent machine, make sure the installation path does not contain spaces. The default path for Groovy is usually under Program Files (x86), which has spaces in it; the agent does not seem to like that and fails to register.
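
The space-free locations are then referenced in agentparams.ini; the paths below are examples:

    GROOVY_HOME=C:\Groovy
    JYTHON_HOME=C:\Jython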

  • JDBC drivers for source technology

Some source technologies require JDBC drivers to be explicitly installed on the agent machine. This is different from the JAR file copied into the agent's lib folder. A technology like Netezza requires the JDBC driver to be installed on the machine in order to connect to the server. Once the driver is installed, copy its JAR file into the lib folder.
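
Before wiring the driver into the agent, it is worth verifying connectivity with a standalone Groovy script run with the driver JAR on the classpath (groovy -cp nzjdbc.jar NzTest.groovy). The host, port, database, and credentials below are placeholders:

    // Minimal JDBC connectivity check for Netezza; all connection
    // details are placeholders to be replaced with real values
    import java.sql.DriverManager

    Class.forName('org.netezza.Driver')
    def conn = DriverManager.getConnection(
            'jdbc:netezza://nzhost:5480/SALESDW', 'nzuser', 'nzpassword')
    def rs = conn.createStatement().executeQuery('SELECT 1')
    rs.next()
    println "Connected, test query returned ${rs.getInt(1)}"
    rs.close()
    conn.close()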

Cheers !!!

Friday, July 3, 2020

Groovy Series 7: Submit Groovy generated rules to Essbase

Ever since Groovy functionality was introduced in Planning Cloud, it has evolved over time, offering new capabilities and methods to achieve what was not possible with traditional calc script rules. I have sometimes been asked, "Is Groovy a replacement for Essbase calc scripts?", and my answer is a big 'NO'. Groovy should not be perceived as a replacement for calc scripts; rather, consider it a booster that enhances the user experience, provides the flexibility to dynamically generate calc scripts, and improves performance.

Calc scripts are an integral part of how we run calculations on Essbase, whether we generate the code dynamically using Groovy or stick to traditional scripting. Let's look at the different ways to submit these calcs to Essbase and how Groovy has evolved.

  • Return method

This was the oldest, and the default, method until May 2019, when the Job Factory class was introduced. Justifying its name, this method is used to submit a dynamically generated calc script to Essbase for execution. A simple example would be to declare a StringBuilder, capture the Position dimension members from edited cells, build a script with some logic, and return it for submission to Essbase.

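A minimal sketch of the approach, assuming the rule runs on a form with a Position dimension (as in the example above); the dimension name and the calc logic inside the FIX are illustrative:

    // Collect the Position members from edited cells
    Set<String> editedPositions = [] as Set
    operation.grid.dataCellIterator({ DataCell cell -> cell.edited }).each { DataCell cell ->
        editedPositions << cell.getMemberName('Position')
    }
    if (!editedPositions)
        return                     // nothing edited, nothing to submit

    // Build the calc script restricted to the edited positions
    StringBuilder script = new StringBuilder()
    script << """
    FIX("${editedPositions.join('", "')}")
        CALC DIM("Account");
    ENDFIX
    """
    // Returning the string submits it to Essbase; no further Groovy
    // statements in the rule run after this point
    return script.toString()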

The limitation with this method is that once it is called, you cannot perform any further action in the rule. If I have a scenario where I need to calculate the edited cells, aggregate the data, and then perform another action, I would have to break it into at least two rules: one that submits the Groovy-generated script, and another that performs the aggregation.

  • Job Factory

This class was introduced in the May 2019 update and allows submitting an existing rule, ruleset, or template as a job. It lets calculations written in traditional calc script be executed as part of a Groovy rule: all we need to do is create a job for the calculation and submit it. It also allows passing values to runtime prompts. This means a Groovy rule can perform certain tasks before it submits a dynamically generated script to Essbase.

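A rough sketch of the pattern is below. The rule name, the runtime prompt name, and the exact jobFactory method signatures are assumptions from memory; verify them against the EPM Groovy API reference for your release:

    // Hypothetical rule and RTP names; the jobFactory.job(...) signature
    // and argument order should be checked against the javadoc
    String editedCC = 'CC_1000'    // captured earlier from the edited grid

    def jobDef = operation.application.jobFactory
            .job('Aggregate Cost Centre', 'Rules', [costCentre: editedCC])
    def job = jobDef.execute(true) // true = wait for completion (assumption)

    // Ideally, check the job status here before generating and submitting
    // the dynamic calc script for the edited positions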

Considering the same example, I first execute an existing rule as a job, passing the cost centre as the RTP value, and then generate the script using the edited positions and submit it to Essbase. In this example I haven't included the code to check the status of the submitted job, but ideally you would move ahead only once the status is complete.

  • executeCalcScript method

This is the latest addition, allowing a much more flexible way to submit a calculation to Essbase and continue performing further tasks without breaking the logic into multiple rules. The method belongs to the Cube class, allowing us to execute dynamically generated code against that specific cube.

A simple use case would be:

  1. Capture the edited cost centre
  2. Perform a calculation on the edited members in Cube1
  3. Push data from Cube1 to Cube2 using a smart push
  4. Perform another calculation in Cube2 on the same edited cost centre

All of this is performed in a single Groovy rule.

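A condensed sketch of those four steps, assuming a form-level smart push named CC_Push and cubes named Cube1 and Cube2; all names, including the Cost Centre dimension and the calc logic, are illustrative:

    // 1. Capture the edited cost centres from the form
    Set<String> editedCCs = [] as Set
    operation.grid.dataCellIterator({ DataCell cell -> cell.edited }).each { DataCell cell ->
        editedCCs << cell.getMemberName('Cost Centre')
    }
    if (!editedCCs)
        return

    String fixMbrs = '"' + editedCCs.join('", "') + '"'

    // 2. Calculate the edited members in Cube1
    operation.application.getCube('Cube1').executeCalcScript(
            """FIX(${fixMbrs}) CALC DIM("Account"); ENDFIX""")

    // 3. Push data from Cube1 to Cube2 via the smart push
    if (operation.grid.hasSmartPush('CC_Push'))
        operation.grid.getSmartPush('CC_Push').execute()

    // 4. Run another calculation in Cube2 for the same cost centres
    operation.application.getCube('Cube2').executeCalcScript(
            """FIX(${fixMbrs}) CALC DIM("Account"); ENDFIX""")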

All these methods offer the flexibility to design a dynamic and efficient solution, with full control over the members that are acted upon, while achieving the desired performance.

Cheers !!!