Channel: SCN : Discussion List - Data Services and Data Quality
Viewing all 4237 articles

usage / dependencies in data services


Hi experts,

 

We are experiencing some problems with dependencies in data services.

The Usage column says 0 although a dataflow is used, or a workflow says it is used 17 times when it is only used once. I couldn't find a way to replicate it; it just happens randomly.

 

a) Closing and reopening the application does not resolve this.

b) Refreshing the object library does not correct the Usage column.

c) There are different ways to fix it, depending on the case:

   17 times used: the "Where used" function shows the proper value.

     0 times used: I had to delete the object and add it again to the parent object.

 

The "0 times used" case is the one that worries me most, because when I try to export the ATL file, the object is not proposed.

 

The version used is SAP BO DS 4.2 SP6, and the only SAP Note that seems to describe a similar issue does not match my version:

  2314782 - The Usage column for a table in a Datastore shows 1 when it should be 0 in SAP Data Services

 

Any help will be appreciated.

 

Thanks in advance.


Data Services Query


Hi,

 

Does anyone here know how to flag an address in Data Services 4.2 as a personal vs. a business address?

 

Thanks!

Erik.

Same BODS job executes randomly Successfully / Throws error


Hello Experts

 

We have created a BODS job with an ABAP data flow which executes an ABAP program in the ISU system and creates '.dat' files in the ISU staging directory. This directory is mounted on the Data Services job server.

 

Both ISU and BODS are Unix Servers.

 

The issue is that when we execute the BODS job, it sometimes runs successfully without any error, but sometimes it throws the error below.

 

15640 4222756640 R3C-150605 5/20/2016 11:25:02 AM |Data flow DF_MR_ISU_LOAD_EPREIH

15640 4222756640 R3C-150605 5/20/2016 11:25:02 AM The SAP job was canceled for host < >, job name <JOB_MR_EPREIH_FULL 05/20/16 11:24:52>, job count <11245200>, job log from SAP

15640 4222756640 R3C-150605 5/20/2016 11:25:02 AM <20160520, 112452, Job started

15640 4222756640 R3C-150605 5/20/2016 11:25:02 AM 20160520, 112452, Step 001 started (program Z_MR_EPREIH_FULL, variant &0000000000041, user ID SAP_BODS)

15640 4222756640 R3C-150605 5/20/2016 11:25:02 AM 20160520, 112452, Error at OPEN '/MR_ABAP_FILES/BODS_ABAP_FILES/EPREIH_FULL.DAT' (check file)

15640 4222756640 R3C-150605 5/20/2016 11:25:02 AM 20160520, 112452, Job cancelled after system exception ERROR_MESSAGE

15640 4222756640 R3C-150605 5/20/2016 11:25:02 AM >.

15627 392468256 R3C-150605 5/20/2016 11:25:02 AM |Data flow DF_MR_ISU_LOAD_EPREIH

15627 392468256 R3C-150605 5/20/2016 11:25:02 AM The SAP job was canceled for host < >, job name <JOB_MR_EPREIH_FULL 05/20/16 11:24:52>, job count <11245200>, job log from SAP

15627 392468256 R3C-150605 5/20/2016 11:25:02 AM <20160520, 112452, Job started

15627 392468256 R3C-150605 5/20/2016 11:25:02 AM 20160520, 112452, Step 001 started (program Z_MR_EPREIH_FULL, variant &0000000000041, user ID SAP_BODS)

15627 392468256 R3C-150605 5/20/2016 11:25:02 AM 20160520, 112452, Error at OPEN '/MR_ABAP_FILES/BODS_ABAP_FILES/EPREIH_FULL.DAT' (check file)

15627 392468256 R3C-150605 5/20/2016 11:25:02 AM 20160520, 112452, Job cancelled after system exception ERROR_MESSAGE

15627 392468256 R3C-150605 5/20/2016 11:25:02 AM >.

 

I have executed the same job 16 times, and the random outcomes are as below:

 

Execution   Result
1           Successful
2           Error
3           Error
4           Successful
5           Successful
6           Error
7           Successful
8           Error
9           Successful
10          Successful
11          Successful
12          Error
13          Error
14          Error
15          Error
16          Successful

How to use the lookup_ext function in a script


Hi friends,

Could anyone please explain this?

My goal is to look up a column value in a lookup table, so that only those values are allowed in my validation transform.
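For reference, the general shape of a lookup_ext call in a Data Services script looks roughly like the sketch below. The datastore, table, and column names (DS_LKP, OWNER.LKP_TABLE, RETURN_COL, KEY_COL) and the variables are placeholders, not taken from the post, and the exact parameter list should be checked against the reference guide for your DS version:

```
$G_RESULT = lookup_ext(
    [DS_LKP.OWNER.LKP_TABLE, 'PRE_LOAD_CACHE', 'MAX'],   # lookup table, cache mode, return policy
    [RETURN_COL],                                        # column(s) to return
    [NULL],                                              # default(s) when no match is found
    [KEY_COL, '=', $G_INPUT]                             # lookup condition
);
```

A Validation transform rule could then test whether the looked-up value is null: rows whose value is not found in the lookup table come back as NULL and would fail the rule.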

How to apply Validation transform?


Hi All,

 

How do I apply validation for the rules below?

 

Table Data:

 

Capture.PNG — table with columns EMPNO, DEPTNO, P1, P2, P31, P32, P41, P42, P51, P52, P6, P7, P81, P82, P9. Sample rows (the column positions of the Y/N/_ values were lost when the screenshot was transcribed):

1001 10 Y ____________
1002 20 N Y/N
1003 30 Y Y/N
1004 10 N _
1005 20 Y Y/N
1006 30 N _
1007 10 Y Y/N
1008 20 N _
1009 30 Y/N
1010 30 Y/N
1011 10 Y Y/N
1012 20 N
1013 30 Y/N

Validation rules are the following:

 

Any record that fails the validation rules will be flagged as an error.

 

1.P1 is either Y or N

2.If P1 is Y  ==>  all remaining columns are empty, as shown in the screenshot.

3.If P1 is N  ==>  P2,P31,P41,P51,P6,P7,P81,P9 are needed.

4.If P31 is N  ==>   P32 should be empty.

5.If P31 is Y  ==>   P32 either Y or N.

6.If P41 is N  ==>  P42 must be empty.

7.If P41 is Y  ==>  P42 either Y or N.

8.If P51 is N  ==>  then P52 must be empty

9.If P51 is Y  ==>  then P52 either Y or N.

10.If P81 is N  ==>  then P82 must be empty

11.If P81 is Y  ==>  then P82 either Y or N.
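As a sketch (not verbatim Validation-transform syntax, which may differ by DS version), each rule above can be expressed as a single boolean condition on the row; rule 1 and rules 4/5 would look something like:

```
# Rule 1: P1 must be Y or N
P1 in ('Y', 'N')

# Rules 4 and 5 combined: P32 is constrained by P31
(P31 = 'N' and P32 is null) or (P31 = 'Y' and P32 in ('Y', 'N'))
```

The P41/P42, P51/P52, and P81/P82 pairs follow the identical pattern, so rules 6 through 11 are copies of the second condition with the column names swapped.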

 

How do I write this validation?

 

Thanks in advance.

How to split DataServices


Hi experts,

 

We have a data services installed on an existing BIP 4.1 environment.

Now we want to split the DS part from the existing BIP and rebuild it on a separate server based on IPS, as recommended by SAP.

Does anyone know how to proceed? Where can I find the necessary information?

After the DS split, how can we remove all of the DS-related data and objects from the current BIP system?

 

I would really appreciate your comments and ideas.

Thanks in advance.

 

Kind regards,

Li

Datastore configuration used by jobs


Hi,

 

Is there any way to identify the number of jobs that use a particular datastore or system configuration?

 

The problem is that there is one user for which we don't have the password, and if we reset the password it may create issues for other jobs that use this user.

 

How can we identify the jobs that use that user/system configuration?

 

If it is used by only one job, we can reset it.

 

Please confirm.

 

Thanks.

 

Best regards,

Akhilesh

Leading Zeros Trimmed while importing from SAP


Dear Experts,

 

We are migrating data from SAP ECC to SAP HANA using SAP Data Services. While comparing the source and target, we see a problem with trimming of leading zeros: SAP has leading zeros, but when we view the data from DS, they are trimmed and moved to the staging area that way. Is there any configuration we could apply at the datastore to keep the leading zeros?

 

Sample Data

 

SAP:

FieldX

001

002

003

 

In DS

FieldX

1

2

3

 

Expected Output:

FieldX

001

002

003
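If the target column is a string, one workaround (a sketch: FieldX and the width of 3 come from the sample above, and the real field may need a different width) is to re-pad the value in a Query transform mapping with the built-in lpad function:

```
lpad(SRC.FIELDX, 3, '0')    # pads '1' back out to '001'
```

This assumes the value arrives as a string such as '1'; if the source column maps to a numeric type, it may first need converting to a string before padding.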

 

TIA

Santhosh


SQL SCRIPT Get Latest TimeStamp (datetime)


Hi, I'm having difficulty getting the latest timestamp for my job. The MAX function won't work, and neither will TOP 1. I need this in order to pull only the latest data using the timestamp.

 

$G_LATEST_TIMESTAMP = sql( 'DS_STG_SOURCE', 'SELECT MAX DI_TIMESTAMP FROM STGPG.STG_EXTRACT_2 ORDER BY DI_TIMESTAMP DESC');

 

 

print($G_LATEST_TIMESTAMP);
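For what it's worth, the SELECT in the snippet above is not valid SQL as written: MAX is a function and needs parentheses around its argument, and ORDER BY is redundant with a single aggregate. A corrected sketch, reusing the datastore and table names from the post:

```
$G_LATEST_TIMESTAMP = sql('DS_STG_SOURCE', 'SELECT MAX(DI_TIMESTAMP) FROM STGPG.STG_EXTRACT_2');
print($G_LATEST_TIMESTAMP);
```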

Data Migration Via Excel & ABAP


Dear Experts,

 

We have a scenario where we will migrate ERP data from MS Navision (Microsoft ERP). There is a small number of records for key data objects like customer, vendor, material, and GL, roughly 10K each, plus some open transactions as well as the typical AP, AR, etc.

Data quality is OK; mostly there would be one-to-one mapping. The plan is to get the extracts in Excel, validate them there with some queries, and load the enriched Excel files to SAP HANA via ABAP programs. Value mappings etc. will be maintained separately.

 

Has anyone come across this kind of scenario? Please let me know if you see any challenges here.

 

Many thanks in advance.

 

 

Rgds

Deep

Transformation equivalent to 'Blocking Step' in Pentaho


Hi Experts,

I just wanted to confirm whether there is any function/transform in BODS that provides functionality equivalent to the 'Blocking Step' in the Pentaho tool.

 

Here's what 'Blocking step' does :

The Blocking step blocks all output until the very last row is received from the previous step.

At that point, the last row is sent to the next step or the complete input is sent off to the next step. Use the Blocking Step for triggering plugins, stored procedures, Java scripts, ... or for synchronization purposes.

 

Dirk Venken : How can we achieve this functionality?

 

Thanks in advance.

Is there any way to take backup of Data service Repository and user metadata details from CMC?


Hello Experts,

 

Is there any way to take backup of entire Repository and user meta data details from CMC?

 

We are going to upgrade our BODS version from 4.0 to 4.2; after that we need to upgrade all our repositories (more than 200) to 4.2 and re-sync the job servers.

 

Instead of taking the repository metadata details from the CMC for each repository, is there any other way to track all these details in Excel (a backup of the CMC)?

 

 

Thanks,

Sagar T

Data Transfer Transform error


Hello,

 

This is regarding an issue with the Data Transfer transform, which produces the error below:

 

DFC-250014 Data flow did not receive registration requests from its children within 360 seconds

Could anyone suggest why this error pops up and the job fails?

 

Regards,

Yogesh Verma

First 10 records from flat file using windows batch script


Hi Folks,

I need a Windows batch script for the following:

 

1. How to get the first 10 records from a flat file.

 

2. How to process the most recent file in a Windows folder.
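A minimal sketch of both tasks in Windows batch; the file and folder paths are placeholders, and note that `for /f` skips blank lines when reading a file:

```bat
@echo off
setlocal enabledelayedexpansion

rem Task 1: echo the first 10 lines of a flat file
set /a n=0
for /f "usebackq delims=" %%L in ("C:\data\input.txt") do (
    echo %%L
    set /a n+=1
    if !n! geq 10 goto done_head
)
:done_head

rem Task 2: pick the most recently modified file in a folder
rem dir /b /o-d lists files newest-first, so the first entry wins
for /f "delims=" %%F in ('dir /b /o-d /a-d "C:\data\inbox"') do (
    set "newest=%%F"
    goto got_newest
)
:got_newest
echo Latest file: !newest!
```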

 

Thanks,

Naveen

al_jobserver.exe high memory consumption


Hi,

 

We have observed occasional high memory consumption by al_jobserver.exe. The details follow:

 

DS version: 4.2 SP4

 

Landscape: distributed landscape, 4 job server machines

 

Each job server has 16 GB RAM

 

The memory spike occasionally happens on any one job server out of the available four.

 

No heavy jobs running during that time.

 

While al_jobserver.exe memory was around 12 GB, the total memory consumed by al_engine.exe did not cross 2 GB (around 5 al_engine.exe processes were running).

 

Kindly let me know if you have faced a similar memory spike in al_jobserver.exe, and how you debugged/resolved it.

 

Regards,

Ravi


Confusion with data display in the Min, Max, Median columns (under Value) in data profiling in SAP BOIS


Hi All

 

I was trying out the "Column profiling" option in the SAP BOIS tool. There is a field called "LANDX" with unique country name values, as shown below:

 

Sample values under the LANDX column:-

Landx column.png

The Min, Max,Median value displayed by the Infosteward tool:

 

infodoubt3.png

I am confused: why is the value "Germany" displayed as the low value, "Libya" as the median, and "Zimbabwe" as the high? With numeric fields the logic was straightforward, but I am confused in the case of such string values. Please help me clarify this.

Confusion with the "Value" display under Distribution while profiling in SAP BOIS?


Hi All

I am a newbie in the world of data governance (profiling). I was trying out the profiling option with some sample data in the SAP Information Steward tool, but I have a doubt about the value displayed under "Value" (in the Distribution tab) in the profiling results, as shown in the following image (highlighted on the right):

 

My "Adrs" field contains three-digit numeric code values (no text), and there is no data with the value 999. Then why is '999' displayed there?

Please help me understand the concept.

infosteward.png

SIA Error Failed to Decrypt object 100


Hi Folks,

 

I am new to Data Services and trying to pick it up. We have a Setup of Data Services done by a contractor. We have a distributed setup of the BOBJ and Data Services components separate from the SQL Server installation.

 

We started facing problems after the SQL Server user password expired; the contractor did not set it as non-expiring. I remember the password that was set, so I changed the user to non-expiring and gave it the same password. The ODBC connection from the BOBJ server to SQL Server works fine, but the SIA agent logs give the error: "Failed to retrieve cluster name from the database. Reason: Failed to decrypt object 100 using cryptographic ID: 4."

 

I get the same error when I open the Configuration tab. I am trying to use the Specify option, but it asks me for a cluster key, which I don't know.

 

Any help is appreciated.

 

BTW : This is not a clustered installation, we have only one node.

 

regards

Yogesh

Does RFC Transfer method require any directory access?


Hi,

 

I am currently setting up the RFC transfer method on BODS 4.2.5 with ABAP data flows, but unfortunately it fails, saying there is no access to a certain path. I was under the impression that the RFC transfer method did not require any directory access. Is my understanding correct? And has anyone set up the RFC transfer method with ABAP data flows?

How can I add a new field/column to a BODS permanent table?


Good day.

 

I am supporting a project in which we extract data from SAP ECC using SAP BODS and load it into SQL Server. On top of that server, we have built a universe with a single connection to create Web Intelligence reports.

 

Now we have a requirement to add a new field, so we created the field in ECC and wrote the logic to populate it in the extractor. The field is now available on the extractor 2LIS_13_VDITM in ECC.

 

In BODS, we re-imported the source datastore to include this field. Now we have to pass this field to a target table (a permanent table). From my understanding, it is impossible to add a field manually to a permanent table. Somebody suggested deleting the table in the database, creating another one with the same name, mapping it manually, and then importing it to make it a permanent table. However, we do not have access to the database or the production system, and the client does not want the table deleted.

 

My question is: how can we add this field to the target table without deleting the table in the database?
