Channel: Data Services and Data Quality

Schedule a job using Data Services Management Console.


A step-by-step process to schedule a job using the Data Services Management Console:


  1. Log in to the Data Services Management Console.
  2. Go to 'Administrator' and expand the 'Batch' menu in the left-hand panel.
  3. Select the repository that contains the job.
  4. Then go to the 'Batch Job Configuration' tab.
  5. Select the project of your job.
  6. Then click 'Schedules' (under the 'Other Information' column) for the job you want to schedule.
  7. Then click the 'Add' button and give your schedule a name.
  8. Select a date/day for your schedule. There is a 'Recurring' option if you want to schedule the job in a recurring fashion (say, every 2 hours, daily at a specific time, or every Monday).
  9. Provide a time for the schedule (your job will be triggered automatically at this time once you activate the schedule).
  10. You are done; click the 'Apply' button and the schedule for your job is created. (Make sure you have selected the 'Active' checkbox below the schedule name before you click Apply.)

 

 

Now go to the 'Repository Schedules' tab, where you will see the schedule you have created for the job.


Regards,

Gokul


Load multiple sheets from an Excel workbook


1. Define two global variables: one to store the sheet name and one to store the total number of sheets. We will also need one local variable as a counter.

 

variables.png

 

Initialize $L_SHEET_COUNT = 1 in initializing_SCR.

 

2. Now drag in a While loop and add the DF in which you made the Excel sheet the source. Add two more scripts: one before the DF and one after the DF.

 

while.jpg

As you can see, the WHILE condition is $L_SHEET_COUNT <= $G_Total_Sheet, where $L_SHEET_COUNT = 1 and $G_Total_Sheet = 4 (because I have 4 sheets).

 

3. In Sheet_Name_SCR, write the code below:

 

$G_SHEET_NAME = 'sheet'||$L_SHEET_COUNT;

print('Loading '|| 'Sheet'||$L_SHEET_COUNT);

 

4. Increment the counter with $L_SHEET_COUNT = $L_SHEET_COUNT + 1 in increment_SCR.

 

5. Define the file format.

 

file_format.jpg

Right-click on Excel Workbooks, select New, and then create the file format. The screen above shows how to do it.

 

Make sure you have checked 'Use first row values as column names', chosen the 'Worksheet' option, and passed as its parameter the global variable that you set in Sheet_Name_SCR.
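Putting the scripting pieces from steps 1-4 together, here is a minimal sketch of the loop scripts (object and variable names as used above; setting $G_Total_Sheet = 4 in the initialization script is an assumption for this example):

# initializing_SCR - runs once, before the While loop
$L_SHEET_COUNT = 1;
$G_Total_Sheet = 4;      # assumed: the workbook has 4 worksheets

# While loop condition: $L_SHEET_COUNT <= $G_Total_Sheet

# Sheet_Name_SCR - inside the loop, before the DF
$G_SHEET_NAME = 'sheet'||$L_SHEET_COUNT;
print('Loading '||'Sheet'||$L_SHEET_COUNT);

# increment_SCR - inside the loop, after the DF
$L_SHEET_COUNT = $L_SHEET_COUNT + 1;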

 

Please like if it helps.

 

Thanks,

Imran

SAP Data Services with BO Tools & Information Steward Installation + Information Part1


Hi Friends,


This blog will give you an idea about the Data Services and Information Steward installation, plus related information.


**We can install with a temporary license; please check the info below.**


IMPORTANT INFORMATION - PLEASE READ: 90-day temporary license keys are available in the Service Marketplace:


https://support.sap.com/keys-systems-installations/keys.html

 

1.jpg


SBOP INSTALLATION



Please choose a setup language

  • Make sure that Microsoft .Net Framework 3.5 or higher is installed
  • Run setup.exe
  • Choose a language for the setup program to use during the installation process and click OK to continue

 

2.jpg

Check Prerequisite

  • The installation program checks for the required components; review the results and decide whether to continue with the installation
  • Click Next to continue

4.jpg

 

Welcome Screen

  • Review the recommendations made by the installation program on the Welcome screen
  • Click Next to continue

5.jpg

 

License Agreement

  • Click next to continue

6.jpg

 

Choose Language Packages

  • If you need more languages, select them here
  • Click next to continue

7.jpg

 

Choose Install Type

  • Choose mode of installation
  • Click next to continue

8.jpg

 

Configure Destination Folder

  • Choose the installation path
  • Click next to continue

9.jpg

Select Default or Existing DB

  • Sybase SQL Anywhere DB: Embedded database used for the default CMS and Audit databases
  • Existing DB: Choose your existing installed and configured database

10.jpg


Select Java web application Server

     Apache Tomcat serves as a “gateway” between the client (web browser) and the BI platform on the server

 

  • Select Install the default Tomcat Java Web application server and automatically deploy web applications.
  • Click Next to continue

11.jpg

Configure Version Management

  • Select Configure and Install Subversion
  • Click Next to continue

12.jpg

Configure Server Intelligence Agent (SIA)

     The SIA takes over BusinessObjects service/server management from the CCM. When you start an SIA, you can configure all, some, or none of the servers managed by the SIA to also be started. All BO servers in an SIA must belong to the same cluster.

 

  • Enter a Node name (Installation server host name)
  • Select the SIA Default Port Number 6410
  • Click next to continue

13.jpg

Configure Central management Server (CMS)

  • Select the CMS default Port 6400:

          The CMS by default uses port 6400 to communicate with Designer, WebI Rich Client, Information Design Tool, Business View Manager, and the application server(s), and may sit behind a firewall internal to the organization. In other words, the CMS listens for requests on this port, which is referred to as the name server port. When the BI server starts, all the other BI services register with the CMS on the name server port; the CMS then sends its request port number to the other services to use for communication after registration.

  • Click next to continue

14.jpg

Configure CMS Account

  • Enter the CMS Administrator account password and the CMS Cluster Key as per your landscape
  • Click Next to continue

15.jpg

Configure Sybase SQL Anywhere

  • Select the SQL Anywhere default Port 2638
  • Enter SQL Anywhere Administrator account Password
  • Click Next to continue

16.jpg

Configure Tomcat Application Server Ports

  • Connection Port – A port number on which the web application server listens for incoming connections from web clients
  • Shutdown Port – The port on which the web application server listens for a remote shutdown request
  • Redirect Port – The port to which requests for secure web connections are automatically redirected
  • Click Next to continue

17.jpg

Configure HTTP Listening Port

  • The default HTTP listening port number on which the Web Application Container Server (WACS) listens for incoming connections from web clients
  • Click Next to continue

18.jpg

Configure Subversion

     Make sure the LCM_Repository directory is in a shared location and that you can access it from all machines via the network share.

  • Confirm the Default Repository Port number
  • Enter the Repository Password
  • Click Next to continue

19.jpg

Select Connectivity for Solman Diagnostics (SMD) Agent

  • Select Do not configure connectivity to SMD Agent
  • Click Next to continue

20.jpg

Select Connectivity to Introscope Enterprise Manager

  • Select Do not integrate connectivity to Introscope Enterprise manager
  • Click Next to continue

21.jpg

Start Installation

  • Click Next to continue

22.jpg

Post Installation Steps

  • CMC - Central Management Console, the web-based administration interface for your BO Enterprise system, where one can add new users/groups, create folders, set access rights, configure SSO, configure your BO Enterprise server services, etc.
  • Once the installation is completed, configure the post-installation activities in the Central Management Console (CMC)

24.jpg

Installation Finished

  • Once the installation is finished, restart the OS for the changes to take effect.

25.jpg


HDB CLIENT INSTALLATION


Note: The SAP HANA client provides the "client tools", i.e. the ODBC, JDBC, SQLDBC, etc. drivers for Windows used to connect to the HANA database, for example when you want to use Excel to connect to SAP HANA.

 

SAP HANA Studio is the Eclipse-based development environment for SAP HANA.

 

The SAP HANA database lifecycle manager (HDBLCM) is a wrapper tool that calls the underlying HDB tools to perform the platform LCM action. If something unexpected happens when using HDBLCM, and the LCM action cannot be completed, you can check the logs and separately run the affected underlying tools.

 

For more info, please check SAP guides.

 

http://help.sap.com/saphelp_hanaplatform/helpdata/en/c6/ce24c3bb571014bcfdf6abefff1d4f/frameset.htm

 

1577128 - Supported clients for SAP HANA

 

Please choose the hdbsetup.exe

  • Run hdbsetup.exe

26.jpg

SAP HANA Life Cycle Management

  • Choose the path and click next to Continue

27.jpg

SAP HANA provides the following standard interfaces for connectivity

  • Click Install to Continue

28.jpg

Installation Finished

  • Choose Finish

29.jpg

Related SAP Notes and KBAs

1251889 - License key request for SAP BO

1285639 - What license types are available for BO Enterprise

1288121 - Temporary license keys

1566573 - Most common SAP BO license key related

1511008 - How to assign authorizations for another S-User ID

 

Note: The next blog will explain the IS installation.

 

Thank you,

Srini T.

Creation of Repository, Datastore and Replication of ECC Data using BODS with SQL DB Part I


Hello All,

 

In this blog I am going to explain, in two parts, how to create a repository and a datastore and how to replicate ECC data into BODS.

 

In the first part I will explain:

 

1) How to Install Data Services

2) Data Services to SAP ECC Configuration

3) Repository Creation

 

Prerequisites :


1) BO Server or Information Platform Services must be installed for the Central Configuration Manager.

 

In my case, I have installed the BO Server for the Central Configuration Manager.

 

Environment :

 

This tutorial has been written with the following environment:

 

OS          : Microsoft Windows Server 2008

 

SAP BO  : SAP BusinessObjects BI Platform 4.1 SP06

 

Database : MS SQL Server 2008

 

1) Data Services Installation Process :

 

You can find the media at : http://service.sap.com/support>Software Downloads > Installations and Upgrades > A - Z Index > D > SAP Data Services > SAP Data Services 4.2

 

Extract the media and start the Installation

 

Run Setup.exe

1.PNG

Choose Next to continue

2.PNG

Choose Next to continue

3.PNG

Choose Next to continue

4.PNG

Accept the license agreement

5.PNG

Assign the product key and choose Next

6.PNG

Choose Next to continue

7.PNG

Select the language and choose next

8.PNG

Provide the CMC Details

9.PNG

Choose Yes

10.PNG

If you choose the option with default configuration, the Data Services repository will be created in the chosen database.

 

In my case I choose 'Install without configuration' and continue, so that I can create the repository later.

11.PNG

Choose Next to continue

12.PNG

Choose Skip Configuration and proceed

13.PNG

Provide the OS credentials

14.PNG

If you want to integrate with Solution Manager, choose it.

 

In my case I don't want to integrate DS with Solution Manager, so I choose not to integrate.

15.PNG

The Data Services installation has started; choose Next to continue

16.PNG

The features below will be installed

17.PNG

The Data Services installation finished successfully

18.PNG

 

2) Data Services to ECC Configuration :


Log on to the SAP ECC system from which you want to replicate data


Create a TCP / IP RFC Connection

19.PNG


Assign any Program ID, e.g. SAPDS_CLIENT

20.PNG

After providing the details, save the RFC connection

21.PNG

Open Data Services Management Console and login

22.PNG

Choose the Administrator tab

23.PNG

Choose RFC Server Interface in SAP Connections and click on ADD

24.PNG

Provide the TCP/IP RFC ID and ECC details, and choose Apply

25.PNG

The RFC is working fine from DS

26.PNG

After creating the RFC in Data Services Management Console, check the RFC from SAP ECC in Tx-->SM59

27.PNG

Login to Data Services Designer

28.PNG

I am unable to log in because no repository has been created yet, so create a repository.

29.PNG


3) Process of Creating Repository :


Log in to MS SQL Server Management Studio and create a new database

30.PNG

 

Created DB : DS_REPO

31.PNG

Create a User for the DB : DS_REPO

 

Username : XXXXXX

Password : XXXXXX

32.PNG

Assign Permissions

33.PNG

Open the Data Services Repository Manager, provide the details of the DS_REPO database, and choose Create

34.PNG

After creating the repository, choose Close

35.PNG

The Repository has been Created.

 

See Part II for the repository configuration, datastore creation, and replication of SAP ECC data.

 

Hope this is Helpful...

 

Thanks..

 

 


SAP Data Services with BO Tools & Information Steward Installation + Information Part2


DS 4.2 SP05 INSTALLATION:


Architecture:

The following figure outlines the architecture of standard components of SAP BO Data Services.

30.jpg

  • Run setup.exe

31.jpg

Prerequisite check

  • The installation program checks for the required components; review the results and decide whether to continue with the installation
  • Click Next to continue

32.jpg

Deployment platform message

  • Click Next to continue

33.jpg

Welcome Screen

  • Review the recommendations made by the installation program on the Welcome screen
  • Click Next to continue

34.jpg

License Agreement

  • Click next to continue

35.jpg

Product Registration

  • Enter the key & Click next to continue

36.jpg

Specify the Installation Path

  • Click next to continue

37.jpg

Choose Language Packages

  • If you need more languages, select them here
  • Click next to continue

38.jpg

CMS Configure Information: CMS for use with the ES Repository

CMS - Central Management Server: a process running as part of your BO Enterprise servers that manages the CMS database, authenticates users, stores access rights, etc. The CMS is the heart of a BO Enterprise system. Provide the CMS user details and click Next to continue.

39.jpg

PW: *********

Choose - Yes

40.jpg

Installation Type Selection

  • Choose Install with default configuration
  • Click next to continue

41.jpg

Select Features

  • Choose the features that you need to install
  • Click next to continue

42.jpg

Specify Repository Database Type

  • Choose Repository Database Type
  • Click next to continue

43.jpg

Specify JDBC Driver

A JDBC driver is a software component that enables a Java application to interact with a database. JDBC drivers are analogous to ODBC drivers.

  • Browse the JDBC Driver.
  • Click next to continue

44.jpg

Repository Database Connection

Specify the connection information for the database in which the repository will be created

  • Provide Database Server Version, Server name and Port
  • Username and Password
  • Click next to continue

45.jpg

PW : *********

Login Information

  • Provide the User details
  • Click next to continue

46.jpg

PW : *********

Solution Manager Integration

  • Choose if you want to integrate with Solution Manager
  • Click next to continue

47.jpg

Feature Summary

  • Review the list of features that will and will not be installed
  • Click next to continue

48.jpg

Installation Finished

  • Choose Finish

49.jpg

IS 4.2 SP05 INSTALLATION

The SAP BO Information Steward application provides the tools you need to understand and analyse the trustworthiness of your enterprise information. With integrated data profiling and metadata management functionality, the solution provides continuous insight into the quality of your data, giving you the power to improve the effectiveness of your operational, analytical, and governance initiatives.

Data profiling - Improve information trustworthiness and reduce the risk of propagating bad data.

Metadata management - Consolidate, integrate, and audit your metadata from all relevant sources.

Root cause and impact analysis - Determine the origin of data quality problems and how they impact downstream processes or information assets.

Validation rule management - Define data validation rules against data sources and apply rules continuously to monitor data quality.

Creation of a metadata business glossary - Promote a common understanding and acceptance of business terms and build a central location for organizing them.

Development of cleansing packages - Create and reuse the rules, patterns, and dictionary that comprise data cleansing packages.

  • Run setup.exe

50.jpg

  • Choose Yes

51.jpg

Prerequisite check

  • The installation program checks for the required components; review the results and decide whether to continue with the installation
  • Click Next to continue

52.jpg

Deployment platform message

  • Click Next to continue

53.jpg

Welcome Screen

  • Review the recommendations made by the installation program on the Welcome screen
  • Click Next to continue

54.jpg

License Agreement

  • Click next to continue

55.jpg

Product Registration

  • Enter the key & Click next to continue

56.jpg

Specify the Installation Path

  • Click next to continue

57.jpg

Choose Language Packages

  • If you need more languages, select them here
  • Click next to continue

58.jpg

 

To be continued.....

SAP Data Services with BO Tools & Information Steward Installation + Information Part3


Installation Type Selection

  • Choose Primary
  • Click next to continue

59.jpg

CMS Configure Information

  • Provide the CMS User details
  • Click next to continue

60.jpg

  • Choose Yes

61.jpg

  • Choose Yes

62.jpg

Select Features

  • Choose the features that you need to install
  • Click next to continue

63.jpg

a.png

Specify Repository Database Type

  • Choose Repository Database Type
  • Click next to continue

b.png

Specify JDBC Driver

  • Browse the JDBC Driver
  • Click next to continue

64.jpg

Repository Connection Information

  • Provide Machine Name, Port Number
  • User name and Password.
  • Click next to continue

65.jpg

Repository options

  • Choose create Repository
  • Click next to continue

66.jpg

  • Click Yes to continue

67.jpg

Start Installation

  • The IS installation has started
  • Click Next to Continue

68.jpg

  • The IS installation is in progress

c.png

69.JPG

 

Installation finished successfully..

 

Related Notes:

 

560499 - Global Support Customer Interaction

1570523 - Information Steward Trouble Shooting Tips

1631192 - Information Steward Installation is missing services under the EIMAdaptiveProcessingServer

Creation of Repository, Datastore and Replication of ECC Data using BODS with SQL DB Part II


Hello All,

 

This is a continuation of my first part:

 

Creation of Repository, Datastore and Replication of ECC Data using BODS with SQL DB Part I

 

In this blog I am going to explain:


1) Repository Configuration

2) Data store Creation

3) Replication of SAP ECC data.


1) Process of Configuring Repository :


Login to Central Management Console

 

36.PNG

 

Choose Data Services

 

37.PNG

 

Go to the Manage tab --> Choose Configure Repository

 

38.PNG

 

Provide the DS_REPO details

39.PNG

The DS_REPO repository has been configured in the CMC

40.PNG

Right click and choose User Security

42.PNG

Choose Administrator

43.PNG

 

Choose the required access levels and select the “ > ” symbol

44.PNG

 

Choose Advanced tab

advanced tab.PNG

Click on Add/Remove Rights

45.PNG

Assign required authorizations as per your requirement

46.PNG

Choose the Data Services repository, assign the required authorizations, choose Apply, and click OK

47.PNG

Now we are able to view the repository and log in to Data Services

41.PNG

 

2) Process of Data Store Creation :


After logging in to Data Services, choose Create Datastore

48.PNG

 

Here provide the details of SAP ECC system

49.PNG

But here, at the bottom, I faced an issue related to the Job Server. To resolve it, we need to create a Job Server.

50.PNG

Log in to the Data Services Server Manager and choose the Configuration Editor

51.PNG

Choose Add

52.PNG

Provide the required details and choose OK

53.PNG

The Job Server has been created

54.PNG

We are able to view the Job Server.

55.PNG

Choose Access Server tab

56.PNG

Create an empty folder for BODS_SHARE and browse that path

57.PNG

Choose Close and restart

close and restart.PNG

The Job Server setup is done; now, in the lower pane, we are able to see the Job Server.

 

Next Choose Create Datastore

58.PNG

Provide the ECC details and BODS_SHARE path

59.PNG

The Data Store SAP_EH6 has been created.


3) Replication of SAP ECC data :

 

To import the tables from EH6 --> right-click on Tables --> choose Import By Name

60.PNG

Give any table name, e.g. MARA

61.PNG

The table has been imported successfully.

62.PNG

 

Hope this is Helpful...

 

Thanks..

BODS Start-up Activities for beginners


Installation of IPS and DS:

 

Please refer to the links below for both the IPS and DS installation.

 

IPS Installation:

http://scn.sap.com/community/data-services/blog/2012/11/27/information-platform-services-ips-installation

DS Installation:

http://scn.sap.com/community/data-services/blog/2012/11/27/data-services-40-sp1-installation

 

Once done with both the IPS and DS installations, follow the steps below to create the repository and a user with appropriate access.

 

  1. Repository creation

 

Before creating the repository, we need the database details for a user with full or admin access on the particular DB.

 

Once we have the database details, navigate to the SAP BusinessObjects Data Services folder under All Programs and select Data Services Repository Manager.

 

bods.png

Provide the required details in the fields and click Create (for a local repository).

 

 

bods.png

 

Once the repository is created, we need to assign a Job Server to it.

 

2. Assigning a Job Server to the Repository

 

Select Data Services Server Manager from All Programs

 

bods.png

bods.png

Click on Configuration Editor

 

bods.png

Click on Add

 

bods.png

Give a name in the Job Server name field and leave the port number as it is (those are the defaults).

 

 

Once we have given the Job Server name, select the Add button to assign the repository database details to the Job Server.

 

bods.png

 

Provide the DB details in all fields, then click Apply and OK.

 

 

3. Registering the Repository in the CMC with DB Details

 

Select Information platform services Central Management from All Programs

 

bods.png

You will then be navigated to the web console link.

 

bods.png

 

Log in to the CMC with admin credentials.

 

Select Data Services from the drop-down list.

 

bods.png

Click on the highlighted box below.

 

bods.png

Enter all the details (repository name and DB details), click Test Connection, and then Save; the repository will then be registered in the CMC.

 

 

4. User Creation and Permissions

 

Select Information platform services Central Management from All Programs

 

bods.png

 

Select Users and Groups from the drop-down list.

 

 

bods.png

 

 

Click on the highlighted option below.

 

bods.png

Enter all the details and click Create, then Close.

 

 

bods.png

 

Once the user is created, provide the related access (Designer (Read/Write) / Console).

 

 

Right-click on the particular user and click Join Group.

 

 

bods.png

 

Select Data Services Designer (Designer window access), Monitor (Management Console monitor access), and Operator (execution access from the Management Console) from the list.

 

 

bods.png

 

bods.png

 

And click on OK

 

Now the user has read access to the repositories from the Designer.

 

If we want to give this user full access to a particular repository, follow the steps below.

 

Select Data Services from the drop-down list and select the particular repository for which you want to give the user full access.

 

Right-click on the repository and select the User Security option.

 

bods.png

Next, click Add Principals.

 

bods.png

 

Select from the list the user to whom you want to provide full access.

 

bods.png

 

 

bods.png

Click on Add and assign security

 

bods.png

Select Full Control and click Apply.

 

 

5. Once we complete all the above steps, try to log in to the Designer to cross-check.

 

bods.png

 

bods.png

Cheers

 

Sagar T


Installation of SAP HANA CLIENT, ODBC & DESIGN STUDIO


Hi All, the initial topic of my blogging activities on SDN is the installation of SAP HANA CLIENT, ODBC & DESIGN STUDIO.

 

TOOLS INFORMATION:

HANA CLIENT TOOL

ODBC

DESIGN STUDIO 

 

PREREQUISITES:

The following components need to be installed before starting the installation:

  • Microsoft Office 2007
  • Adobe Flash 10.1 or higher
  • SQL Server Native Drivers.
  • Hana client tools.

 

HANA CLIENT TOOL

The SAP HANA client software can be installed on a variety of platforms.

The following platform types are supported:

● AIX

● HP-UX

● Linux

● Microsoft Windows

● Solaris


The HANA client is the piece of software that enables you to connect any other entity, including non-native applications, to a HANA server. This "other" entity can be, say, an NW Application Server, an IIS server, etc. The HANA client installation also provides the JDBC and ODBC drivers. This enables applications written in .NET, Java, etc. to connect to a HANA server and use it as a remote database. So, consider the client the primary connection enabler to the HANA server.

  • The SAP HANA client provides the "client tools", such as the ODBC drivers for Windows, used to connect to the HANA database when, for example, you want to use Excel to connect to SAP HANA.
  • SAP HANA Studio is the Eclipse-based development environment for SAP HANA.

Download the media from the Service Marketplace (SMP credentials are needed).

 

1.png

 

When you install the SAP HANA client software package, supported clients are installed and available. The clients available on Microsoft Windows platforms are as follows:

● SQLDBC

● ODBO

● ODBC

● JDBC

● Microsoft ADO.NET


Run the setup file with 'Run as administrator' rights and follow the wizard.

 

3.png


Confirm the standard options and go through the short installation procedure:


4.PNG

Click on Install


5.png

 

After installing the SAP HANA client and SAP HANA Studio, we can start HANA Studio from the Windows Start menu: go to All Programs, type "SAP HANA", and click "SAP HANA Studio".

 

Installation completed successfully.


Default Installation Paths for Windows

Microsoft Windows x86 64-bit: C:\Program Files\sap\hdbclient


For more info, please check SAP guides.


http://help.sap.com/saphelp_hanaplatform/helpdata/en/c6/ce24c3bb571014bcfdf6abefff1d4f/frameset.htm


1577128 - Supported clients for SAP HANA


ODBC Connection to SAP HANA

After the HANA Client is installed, if not already created, you will need to create an ODBC Connection to HANA.

 

What is ODBC?

ODBC (Open Database Connectivity) is a standard middleware API for accessing database management systems (DBMS). The designers of ODBC aimed to make it independent of database systems and operating systems.

 

6.png

The figure above shows how ODBC fits in between an application and the database it is accessing.


From the Start menu → select Control Panel →Select Administrative Tools

7.png

Select Data Sources (ODBC)

8.png

Select the System DSN tab → Select Add → Select HDBODBC for 64-bit machines or HDBODBC32 for 32-bit machines → Select Finish.

9.png

Click the Finish button.

 

Complete the entries: Data Source Name → HANA_ODBC (suggested); Description → HANA ODBC (suggested); Server:Port → then select Connect.

10.png

The server name should already be filled in for you to test your connection; leave the User and Password fields blank! Your AD/MC credentials are transmitted automatically.

When you receive the 'Connect successful!' message, click OK.

11.png


DESIGN STUDIO MEDIA DOWNLOAD AND INSTALLATION PROCESS:

System Requirements:

Before installing the design tool of SAP Business Objects Design Studio, make sure that the following components are installed on the local machines of your application designers:

● Internet Explorer 9.0, 10 or 11

● For the 64-bit version of the design tool installer: 64-bit version of Windows

● For the 64-bit version of the design tool installer: 64-bit Secure Login Client of SAP NetWeaver Single Sign-On

SAP BusinessObjects Design Studio enables application designers to create analysis applications and dashboards – based on SAP NetWeaver BW, SAP HANA and universe data sources – for browsers and mobile devices (iPads, for example).


SAP BusinessObjects Design Studio can be used locally and integrated in the following platforms:


  • SAP BusinessObjects Business Intelligence (BI platform)
  • SAP NetWeaver
  • SAP HANA

 

Download the media from service market place by providing the credentials.

Click on DESIGN STUDIO CLIENT 1.5 and download the required media.

12.png

 

 

INSTALLATION PROCESS:

Run the setup file with 'Run as administrator' rights.

14.png

Click the Next button

15.png

 

Click Next; the installation process starts…

16.png

Installation completed successfully.


The default path for 32-bit is C:\Program Files (x86)\SAP BusinessObjects\Design Studio and for 64-bit the path is C:\Program Files\SAP BusinessObjects\Design Studio.


Connecting Design Studio

17.png

 

Go to Application--> New

18.png

19.png


Provide the necessary information and choose Finish.

Click on Add Data Source Tab

20.png

Click Browse; our HANA DB system appears automatically because we have configured the ODBC connection. Choose DBH.

21.png

 

Click OK and provide the HANA database credentials.

22.png

 

Now the connection is active.

23.png

 

Click on ok.

We are able to view the tables successfully.

24.png

 

That's it. In my next blog I will explain SAP Lumira and Analysis for Office.

 

Thank you for reading the blog.

Rajesh K.


SAP Data Services Volume Testing


Recently I was playing around with Data Services to capture statistics by processing a big volume of data. I know Data Services is very mature and capable of handling massive volumes, but I wanted to see it for myself under various scenarios, as my previous experience reminded me that we had used Data Services to transfer ~2 billion records from Oracle to HANA, which took less than 10 days with complex transformations.

 

The fact is that Data Services took ~10 mins to transfer 40 million records from MySQL to SQL Server. I downloaded the IMDB interface files and initially processed them using Python to decode the data into SQL statements. The generated SQL file, around 5 GB in size, was then loaded into MySQL. The result of the import was 60 tables with 45 million records in the MySQL database. I also had a SQL Server instance, so I decided to transfer the data from MySQL to test the performance. I used 3 data flows in series to transfer all 50 tables from MySQL to SQL Server, and the total job execution time was close to 10 mins, which was very quick (of course there were no complex transforms except for a few joins). The machine used to perform this activity was configured with 16 GB RAM and 4 cores.

 

My next exercise will be to push this data into HANA and build a complex calculation view to get some reports out of the IMDB data.

SAP Data Services File Handling Functions


Over time, I have worked with various ways to read a file, check whether a file exists, and so on. Here I would like to share what I have learned thus far, which someone might find useful. Please ignore this if you already know the below.

 

These examples assume DS is installed on a Windows server; if you are using Linux, replace the cmd path with the Linux shell.

 

Check File Exists

 

Method 1

 

ltrim_blanks(word_ext(exec('cmd','dir "[$G_Input_Filepath]" /o /b',8),1,':')) = '0'

 

1) You basically use the DOS dir command with the switches /o /b so that it returns a bare, ordered list.

2) Split the string using word_ext() to check whether the file exists.

3) If the file is not found, the output of exec() will be as below:

Capture.PNG

4) If the file is found:

Capture.PNG

Here is the complete script if you want to test it:

 

$path  = 'D:\Test.txt';
print(exec('cmd','dir "[$path]" /o /b',8));
print(word_ext(exec('cmd','dir "[$path]" /o /b',8),1,':'));
print(ltrim_blanks(word_ext(exec('cmd','dir "[$path]" /o /b',8),1,':')));

Method 2

 

$Flag  = exec('c:\\windows\\system32\\cmd.exe', '/c '||$path,2);

If the file is found, it returns:

Capture.PNG

You could just check if $Flag is not null to make sure the file exists.

 

If the file is not found, you get the output below:

Capture.PNG

The advantage of the above two methods is that you can use wildcards, which you can't with file_exists().

 

file_exists("filename.txt") = 1   # works
file_exists("Filename.???") = 1   # does not work


Method 3


You could also use

wait_for_file($Path,0, 1000) = 1

This function also supports wildcard file names and UNC paths.



Enforce File Name Pattern

 

Sometimes you might want to enforce a certain file name pattern when reading files in Data Services, to avoid reading the wrong files.

For this purpose, you can use placeholders when defining the file names. I normally use a job control table where I define the file path and file name for every file that will be read.

 

Assume the filename entry or variable is set to "Open Orders - ????-??.xls"

 

When this is passed to any of the above methods, it looks for any file that matches the defined pattern.

So the job will pick up "Open Orders - 2015-04.xls" and will ignore "Open Orders - 2015-4.xls" or "Open Orders - 201504.xls".

This way you can enforce only certain file name patterns to be read by the job.
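As a minimal sketch (the folder path and variable names below are illustrative, not from the original post), the pattern can be combined with wait_for_file(), which accepts wildcards:

# Only pick up monthly files that match the expected naming pattern
$G_File_Name = 'Open Orders - ????-??.xls';
$G_Input_Filepath = 'D:\Inbound\' || $G_File_Name;

# Returns 1 when at least one file matching the pattern exists
if (wait_for_file($G_Input_Filepath, 0, 1000) = 1)
begin
     print('Matching file found for pattern: [$G_File_Name]');
end
else
begin
     print('No file matches the expected pattern - skipping the load');
end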

SAP Application Advanced Configuration


Hi All,

 

Is anyone familiar with SAP documentation on SAP Application datastore advanced configuration?

 

I have set up load balancing on Data Services v4.2 SP5 and used a server group that is set up on the ECC system. When executing the job, it seems to populate the "executing server" information in the ECC system job overview (SM37), but TargetServr is not populated.

 

The only way I can see to populate TargetServr is to use a server in the Target Host field. If I use an ECC server group, the job fails with INVALID_TARGET. Does anyone have prior experience using an SAP application datastore with load balancing and server groups created in ECC systems?

BODS 4.2 SP06 - Information Platform Services Download Missing?


Hi

 

The BODS 4.2 SP06 release notes state that Information Platform Services (IPS) 4.1 SP6 or SP7 is required.

 

I can only see IPS downloads available up to SP5.

 

Has anybody seen IPS SP6 or SP7 available?

SAP Data Services builds momentum with BigQuery


Thanks to key benefits like low startup costs and fast deployment time, there is no doubt why cloud-based analytics offerings like Google BigQuery are rapidly gaining popularity. However, this does not mean that companies will completely abandon their on-premise data centers, due to security concerns and other factors. For this reason, many companies have chosen a hybrid approach to implement their big data analytics solution, which requires bi-directional ETL capabilities to move and transform data between on-premise and cloud applications. With the upcoming release in November 2015, you can rely on SAP Data Services to do just that.


Native Google BigQuery datastore support since DS 4.2 SP4

SAP Data Services features a rich set of out-of-the-box transformations, with over 80 built-in functions including native text data processing, data masking, and data quality features that allow users to prepare only the relevant and trusted information before loading it into BigQuery tables. The software supports the JSON data format, so you can use the same Designer UI to create, in a drag-and-drop manner, a dataflow that defines the process for loading data in a flat structure or with nested/repeated fields into BigQuery.


For example, using a Data Quality transform to improve a JSON document that contains multilevel hierarchical data and load it into Google BigQuery can be accomplished with just a few simple steps. Below is a diagram illustrating how to create a dataflow in DS to flatten data, perform the required transformations, re-create hierarchical data as needed, and load it into BigQuery for analytics.



BQ_DF.png


Diagram 1 – an example of cleansing and loading multilevel hierarchical data from a JSON document into Google BigQuery


Enhanced SQL Transform to read data from a BigQuery datastore in DS 4.2 SP6

You can write any BigQuery SQL statements, such as selections, projections, joins, etc., directly in the SQL Transform for complex data retrieval operations. This way, the software pushes down all BigQuery SQL statements to the database layer. As a result, all queries are executed by the native BigQuery analytics engine, giving optimal performance even if you are working with complex data from multiple tables that contain deeply nested structures.


For example, suppose you want to find the number of children each person has in personsData.json; you can use the SQL Transform to aggregate across children and repeated fields within records and nested fields.

SQL_Transform.png


After you click the “Update schema” button, Data Services will automatically populate the output schema with the column information returned by the select statement.


Schema.png

And you will get the following result:

Results.png


As you can see below, the results are the same when run through the BigQuery Web UI.

Save Flat Files with the Current Timestamp (FileName+Timestamp) using a Global Variable


Hi,

 

I had a requirement to store flat files dynamically with the current timestamp attached to the file name. When I searched, I didn't find any specific material on this. Now that I have implemented it, I'd like to share the process with you.

 

  1. Create a flat file format and, in the Root directory, provide the location where you want to store the file. Leave the file name empty here.
  2. Now, in your job, create a global variable, e.g. $G_File_Name. Then place a script before the DF and write the code below in the script (a consolidated sketch follows this list):
    $G_File_Name = 'YourFileName_'||TO_CHAR(sysdate(),'MMDDYYYY')||'_'||TO_CHAR(sysdate(),'HHMISS')||'.'||'txt';
    Note: You can change the file extension as per your
    requirement, e.g. '.csv' or '.txt'.
  3. Assign this global variable $G_File_Name as the file name in the flat file format in the dataflow where you use the flat file as the target.
  4. Every time you run the job, it will create a file with the file name and the current timestamp attached to it.
    e.g. In our case it will generate a file such as 'YourFileName_11302015_021745.txt'.
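As mentioned in step 2, here is the naming script as a minimal consolidated sketch (the print line is only for illustration):

# Script before the DF: builds a name such as YourFileName_11302015_021745.txt
$G_File_Name = 'YourFileName_'||TO_CHAR(sysdate(),'MMDDYYYY')||'_'||TO_CHAR(sysdate(),'HHMISS')||'.txt';
print('Flat file target for this run: [$G_File_Name]');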

 

Thanks,

Gokul


Number of days between two dates excluding weekends


Here's the code that can be used within a custom function to find the number of days between two dates excluding weekends.

It uses the Data Services built-in function day_in_week().


$L_Start and $L_End are the two input dates

 

$L_Diff = day_in_week($L_End) - day_in_week($L_Start);

$L_Output = ((date_diff($L_End, $L_Start, 'D') - $L_Diff) / 7 * 5) + decode(($L_Diff < 5), $L_Diff, 5) - decode((day_in_week($L_End) - 4 > 0), day_in_week($L_End) - 4, 0) % 5;

return($L_Output);
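A minimal usage sketch, assuming the code above is saved as a custom function named CF_Workdays_Between($L_Start, $L_End) (the function name is illustrative):

# Call the custom function from a script and print the result
$G_Start = to_date('2015-11-02', 'YYYY-MM-DD');    # a Monday
$G_End   = to_date('2015-11-13', 'YYYY-MM-DD');    # a Friday
$G_Days  = CF_Workdays_Between($G_Start, $G_End);
print('Working days between the two dates: [$G_Days]');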

 

Binary to decimal conversion


Here's a piece of code that takes a binary number as input and converts it into a decimal output.
$l_input is a varchar and the rest of the variables are of type int.
$l_input is varchar and rest of the variables are of type int.

 

 

#$l_input = '100111101111'; (=2543)
$l_len = length($l_input);
$l_lastDigit = 0;
$l_sum = 0;
$l_i =0;
while ( $l_i < $l_len)
begin
     $l_lastDigit = substr($l_input, length($l_input), 1);
     $l_sum = $l_sum + $l_lastDigit * power(2, $l_i);
     $l_input = substr($l_input, 1, length($l_input) - 1);
     $l_i = $l_i + 1;
end
print ($l_sum);

Delete files older than N days


Hi All,

 

Simplifying the way to delete files older than N days with BODS (Windows platform).

 

Open Notepad, write the code below, and save it as a .bat file:

 

REM Remove backup files older than 7 days
forfiles /p "C:\backup\folder" /s /m *.* /c "cmd /c Del @path" /d -7


Where,

/p <Path> = specifies the path from which to start the search

/s = search sub-directories

/m = search mask (file pattern to match)

/c = the command to run for each matching file

/d = date filter; -7 selects files last modified more than 7 days ago

 

 

This code will delete files older than 7 days; if you want to delete files older than 15 days, write 15 instead of 7.

 

Then call the .bat file from a BODS script:

 

exec('cmd', 'C:\delete_older_File.bat', 8);
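Alternatively, a minimal sketch (not from the original post) that issues the same forfiles command directly from a DS script, without a separate .bat file:

# Delete files older than 7 days directly via exec(); same folder as in the .bat example
exec('cmd', 'forfiles /p "C:\backup\folder" /s /m *.* /c "cmd /c del @path" /d -7', 8);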


Regards,

Gokul

MDG Material duplicate check to be done using multiple ECC systems


Hi Gurus,

 

We are implementing SAP MDG 8.0 and currently have 4 SAP ECC systems and an MDM system from which we will extract data using BODS. As part of our data model, we intend to bring only cleansed data into MDG. Based on our requirement, when a user tries to create a material in MDG and clicks the check button, we would like the duplicate check to take the materials in all four ECC systems (non-cleansed materials) into consideration. I have read about an SAP-provided exit that can be implemented during this check, but I wanted to ask whether DQM can be used for such a custom search. I would appreciate it if anyone can suggest the best approach to follow.

 

Thank you.
