Channel: SCN : Discussion List - Data Services and Data Quality

Data Services Scheduler is not working properly.


Hi Experts,

 

I am using Data Services 4.2 and have around six jobs scheduled under the Windows scheduler to run every day of the week. I have found a strange issue: occasionally a job does not trigger once or twice in a week, although it runs on all the other days. Kindly help me understand what is causing this.

 

 

Thanks & Regards,

Subbu.


Can anyone please help with an issue in SAP Data Services 4.1?


Hi All,

 

I have upsert logic in my job, using the Table Comparison and Map Operation transforms, and I applied transaction control on some target tables. But when I run a full load, the job hangs at the table where transaction control is applied.

Transaction control works fine for the incremental load but not for the full load.

 

Reason for applying transaction control: when we do multiple dry runs of the incremental load, the job fails with the error "the target table is deadlocked".

After applying transaction control, the incremental load ran successfully.

 

 

 

Please help me understand how transaction control can work for both the full load and the incremental load, or whether there is another solution, without transaction control, that lets the job run successfully for both.

 

I appreciate your help, as this is a production issue that needs to be resolved as soon as possible.

 

 

 

 

Thanks in Advance

Sandhya A

Loading data with a large number of columns


Hi,

 

I have a transactional table with more than 700 columns and 1,000,000 rows. During the initial load through BODS to Sybase IQ, it took about 72 hours to load just 84,000 records. Is there a workaround to make this load faster?

 

Thank you,

Priyanka

System cannot find the path specified error


Hi,

 

I am trying to copy files from one folder to another within the same shared drive and am getting the error "cannot find the path specified". Please see my error log:

 

1319612532 PRINTFN 7/25/2015 12:30:25 PM \\vmspfsfsch02\JET_INTEL_TO_BL\*
1319612532 PRINTFN 7/25/2015 12:30:25 PM \\vmspfsfsch02\JET_INTEL_TO_BL\Processed\*
1319612532 PRINTFN 7/25/2015 12:30:25 PM 1: The system cannot find the path specified.
1319612532 JOB     7/25/2015 12:30:25 PM Job <JOB_JET0010_PreLoad> is completed successfully.

 

 

The script is as follows:

 

print ( $GV_Jet_Base_Folder );

print( $GV_Jet_Processed_Folder );

print(exec('cmd','copy "'||$GV_Jet_Base_Folder||'" "'||$GV_Jet_Processed_Folder||' - copied to Processed Folder on '||to_char(sysdate(),'mm-dd-yyyy')|| ' at '   ||to_char(sysdate(),'hh-mi-ss')|| '.txt'||   '"',8));

 

Even if I change my global variables to a path on the job server, I get the same error message.
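One thing stands out from the printed values: both folder variables already end in \*, and a wildcard in the destination path of copy would by itself raise "The system cannot find the path specified". A minimal sketch of a plain folder-to-folder copy, dropping the timestamp rename for clarity, and assuming the variables are changed to hold plain folder paths (no trailing \*) and that the job server's service account can reach the UNC share ($GV_Copy_Result is a hypothetical variable):

# /C tells cmd to run the command and exit; return flag 8 captures the
# command's return code and output so it can be printed and inspected
$GV_Copy_Result = exec('cmd', '/C copy "' || $GV_Jet_Base_Folder || '\*" "' || $GV_Jet_Processed_Folder || '"', 8);
print($GV_Copy_Result);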

 

Can anybody help?

 

Thx
RM

wait function


Hi All,

 

Please give me your input; here is my scenario.

I use wait_for_file($GV_FILE_PATH || '/' || $GV_FILE_TYPE || '*' || '.TXT', 0, 0, 1, $GV_FILE_NAME); in one of the pre-processing scripts to check the path, find the file to process, and finally assign its name to $GV_FILE_NAME.

 

But $GV_FILE_NAME captures the complete path, which I don't want, because I assign this value to the source flat file's file name(s).


Note: $GV_FILE_PATH is defined as a global variable whose value is the path.


If I run the job, it throws an error, because the path is already defined and the wait_for_file function returns the complete path:


Cannot open file </test/Ecommerce/IBMMainFrame/Acknowledge_Files/test/Ecommerce/IBMMainFrame/Acknowledge_Files/POV_Confirm_1.TXT>.



All I need is to pull just the file name from wait_for_file.


Please advise.
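For what it's worth, a minimal sketch of one way to keep only the file name after the wait, using word_ext (which counts tokens from the right when given a negative word number) and assuming the returned path always uses forward slashes, as in the error above:

wait_for_file($GV_FILE_PATH || '/' || $GV_FILE_TYPE || '*' || '.TXT', 0, 0, 1, $GV_FILE_NAME);
# -1 with '/' as the separator keeps only what follows the last '/'
$GV_FILE_NAME = word_ext($GV_FILE_NAME, -1, '/');
print($GV_FILE_NAME);   # e.g. POV_Confirm_1.TXT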


[Attachment: File_Path_Name.PNG]

Dynamic extract from right


Hi All,

 

Please share a function or idea to extract the file name from the right of a path. In the path below, the file name changes dynamically (I am using the wait_for_file function to get the file name, but it returns the complete path), and I need to extract only the file name, i.e. everything after the last "/".

 

 

$GV_FILE_NAME: /test/eRW/MainFrame/Acknowledge_Files/Correct_Confirm_1.TXT
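One option, sketched under the assumption that the path always uses forward slashes: word_ext counts tokens from the right when given a negative word number, so -1 with '/' as the separator returns just the file name.

# with the value above, $GV_FILE_NAME becomes 'Correct_Confirm_1.TXT'
$GV_FILE_NAME = word_ext($GV_FILE_NAME, -1, '/');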

BODS 4.2 Installation: Windows or Linux?


Hello,

 

We are planning a fresh installation of BODS 4.2 SP4, using IPS 4.1 SP4. We have used both Windows and Linux here, and I would like to know which OS other administrators prefer, and why.

 

Thank you,

 

Andrew

Experts, kindly help me solve the below issue.


Hello Experts,


I have a job with more than 25 dataflows on a BODS production system. When I execute the job, it shows the error below, but I cannot find the attribute name 'AIColumn1' or "AICOLUMN1" anywhere in the job. I have spent two days searching for this field/attribute in every dataflow (more than 25 dataflows).


When I read the error log in the DS Management Console, it shows an error in a procedure. I checked the HANA system for the procedure, but unfortunately I could not find any procedures in the HANA system, nor the attribute name "AICOLUMN1".


Error:

Attribute name "AIColumn1"  different from Attribute name: "AICOLUMN1" : line 35 col 1 (at pos 5800)>. The SQL submitted is....



Please find the error log attached.


Experts, kindly help me solve this issue.




Thank You,

Venkat.


SAP Data Services 4.2 Sizing


We are in the process of sizing an SAP Data Services 4.2 server. I could not find any sizing tool that can help with sizing a DS server.

Within the SAP Data Services 4.2 Product Availability Matrix, there is a recommended minimum sizing for a single physical machine installation of Data Services.

Minimum Hardware Requirements

  • 4 processors (or 2 dual-core processors) with a minimum of 2 GHz recommended
  • 16-18 GB RAM recommended

 

Disk Space Requirements (not including Operating System)

  • 20 GB for default installation with English language only installed
  • 23 GB for default installation with all languages

 

I am trying to understand how SAP defined the minimum hardware requirements and what criteria were used to define them.

Can SAP Mentors advise:

  • What scale of installation did the above apply to?
  • What benchmarks were considered when sizing?
  • What consideration was made regarding the number of jobs the server could run?
  • How many jobs were running in parallel, and how large were these jobs?
  • What size of source and target repository was considered when sizing?
  • What type of source/destination database was considered (e.g. MS SQL Server 2012 R2, Oracle)?
  • What degree of complexity did the data transformations have?
  • What degree of complexity was there in referential integrity and data cleansing?
  • How did SAP configure the services across the recommended 4 CPUs?

 

Thanks & Regards,

Prateek

Recovery/Deleting data which was loaded during a Failed Job


I have a job that loads about 100 million records. If the job fails for some reason, how do I delete the data that was loaded during the failed run? How can I add a recovery mechanism to the job? Do you have any ideas?
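For the rerun itself, executing the job with the 'Enable recovery' execution option lets Data Services skip dataflows that already completed. For removing partially loaded data, one common pattern (a sketch with hypothetical datastore, table, and column names, not the only approach) is to stamp every row with a load ID and delete that load's rows in a script before reloading:

# Target_DS, TARGET_TABLE, and LOAD_ID are hypothetical names; [$GV_LOAD_ID]
# is substituted with the variable's value before the statement is sent
sql('Target_DS', 'DELETE FROM TARGET_TABLE WHERE LOAD_ID = [$GV_LOAD_ID]');

With 100 million rows a plain DELETE can be slow; loading into a staging table that can simply be truncated on failure, then moving the data to the final table, is often cheaper.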

What version of SAP Replication Server (SRS) is supported in SAP Data Services 4.2 SP03


Hi Everyone

 

I have looked at the Data Services PAM (updated May 28th, 2014) for 4.2 SP01 and SP02, and I only see support for SRS 15.7.1 SP100, which is pretty old. Is this still the only version of SRS supported? Is there a link to a newer PAM for Data Services?

 

Thanks

Terry Penna

DQM address validation issue in ECC


Hello,

 

We have installed the SAP DQM 4.0 SP5 components as per the installation guide. When we try to perform address validation using the XD02 transaction in the SAP ECC system, the validations do not work correctly. When I perform address validation using the DQ real-time jobs directly in Data Services, it works correctly.

 

I have performed all the ECC configuration for DQM, and I can see the request and response calls on the DQ side, but the address validations are still not working correctly.

 

Please see the screenshot below, which shows an incorrect city name and postcode. The correct address is 10 Allcroft Rd, Birmingham, West Midlands B11 3EB. When I click the Check Address button in ECC, it should show the correct city name and postcode under the Validated Address section.

 

 

[Attachment: Address_Validation_Popup.png]

 

Please help me on this.

 

Many thanks in advance.

 

Regards,

Subrah

Difference Between Table Comparison and Map Operation Transform


Hi all,

 

 

Can anybody explain exactly what the difference is between the Table Comparison and Map Operation transforms, with a scenario?

 

Also, suppose we have more than one duplicate key in the Table Comparison transform; on what basis is that key processed?

 

 

Thanks

 

Narasimha

How to Generate a Suggestion List?


Hello friends, I hope you are all doing well.

 

For the first time, I am implementing a demo job for an address suggestion list in an IDES system (without SAP DQM installed) using the Global Suggestion List transform. I have a few queries; can anybody please help me out?

 

1. As I do not have any directories installed, can I still perform this demo with any of the default data files? If yes, where should I configure the path: in the Global Address group of the transform, or somewhere else?

 

2. Can I use a .csv reference file (containing, say, 50 sample addresses) as the directory where the addresses are looked up? If yes, where do I pass the path of that file, and which options should I configure in the transform?

 

I have already looked in the ds_reference_4_2 guide for configuring the options of the Global Suggestion List transform, but I am seriously confused about the reference file path and the directory path for Global Address (do we need to provide paths in both attributes?). Can anyone give me clarity on this?

 

I am new to this, so if I am wrong, please correct me.

 

Regards,

Santosh

DQM Initial Match code error /FLDQ/RSADRINIBM


Dear all,

 

The /FLDQ/RSADRINIBM report fails with a T061 error (communication error).


The RFC server log file (flrfcsrv_log.txt) says that the RFC server is operational:


23.07.2015 16:37:28 Status: All required SAP system ID *** Data Services system localhost jobs are active

RFC host: ***

DS host: localhost

SAP system ID: ***

SAP program ID FLRFCBTC

 

 

***CHANGED*** Data Services job "Service_Realtime_DQ_SAP_Create_Break_Keys" is running - state = SERVICE_STARTED

***CHANGED*** Data Services job "Service_Realtime_Batch_DQ_SAP_Name_And_Address_Match" is running - state = SERVICE_STARTED



but when Get_potential_batch_matches is called, the following errors occur:


23.07.2015 16:41:54 Error: Unable to start Data Services batch match process. Error message = "Error running batch job."

23.07.2015 16:41:54 Technical Error #898: An error occurred processing the GET_POTENTIAL_BATCH_MATCHES request

23.07.2015 16:41:54 Error: Unable to start Data Services batch match process. Error message = "Error running batch job."

23.07.2015 16:41:54 Error: SAP system *** terminated its connection with the RFC Server ***. Gateway server terminated or currently experiencing a network issue

 

The real-time service 'Service_Realtime_Batch_DQ_SAP_Name_And_Address_Match' is running properly.

The Enable Validation option suggested in note 1511841 (Batch Match job failing - DQM for SAP) is unchecked by default.

 

Any help is welcome.

 

Thanks in advance


SAP DQM services: drive is filling up with logging


Hi ,

 

We are using SAP DQM 4.0 and BODS 4.2 SP3. For one of the DQM services, the Windows service RFC log (FLRFCSRV.log) fills up at a high rate. Even though we have set the Enable Traces option to No, the drive is still getting filled up.

 

Kindly let me know if there is any possible way to resolve this. Also, let me know if any further details should be shared.

 

Regards,

Karthik Rajan P

XML mapping issues - how to get the structure of the target in the mapping output?


Hi Data Services gurus,

 

I am creating a dataflow which reads an XML file, transforms it, and subsequently writes it out as a new XML file.

The source and target formats are different (the target has more fields, which should be populated with default or fixed values).

 

Both source and target have thousands of fields and hundreds of nested structures (they originate from SAP CRM).

 

My problem is that I can't find a way to get the new fields and nested structures into the output schema of my transformation.

I can add them manually, but that is a lot of work.

When I try to copy and paste fields from the target object into the transformation, they are added, but not necessarily in the right order.

 

How can I easily get the structure of my target XML format into the output schema of my transformation?

 

Thanks for your help in advance.

 

Jan. 

Using the Execute Preloaded Option for ABAP Dataflows in SAP BODS



Hello All,

 

This is regarding the use of the Advanced option in the SAP Application (ECC) datastore settings in SAP BODS 4.2, where there are two choices for the ABAP Execution Option property: (1) Generate and Execute and (2) Execute Preloaded. Since our ECC client is often locked by the BASIS team, even in the DEV environment, we would like to use the second option, Execute Preloaded, so that we can extract data from ECC tables without having to ask the BASIS team to unlock the ECC client before every extraction.

 

The problem is that we get an error when generating and uploading the ABAP program to the ECC client. I have searched the blogs, and so far I have only found that certain ABAP programs or function modules that ship with SAP BODS need to be installed by the BASIS team on the ECC side to allow ABAP dataflows to be generated and uploaded to the ECC server. I would appreciate it if anyone could provide a list of the function modules BASIS needs to install on ECC, or a blog that gives details about using this option.

 

So far, from the SAP BODS Designer, we perform the steps below but get an error upon generating and uploading the ABAP dataflow program:

 

1. Create a test ABAP dataflow using the SAP ECC datastore and provide the ABAP program options.

2. Right-click and select Generate ABAP Program.

3. When the ABAP Program Generation dialog box appears, check the box "Upload Generated Program".

4. Upon clicking OK, we get the following error:

 

 

The ABAP program <ZRTEST01> for ABAP data flow <RT_TEST_R3> (datastore <R3_DS>) was not uploaded: < RFC CallReceive error <Function /BODS/RFC_ABAP_INSTALL_AND_RUN: RFC_ABAP_MESSAGE- Exception condition "NOT_SUPPORTED_BY_GUI" trigger[SAP NWRFC 720][SAP Partner ### ][clientname][servername][accountname][4103]>. >.

 

Any help would be greatly appreciated.

 

Thanks,

 

Rizwan

 

BLOB, CLOB and RAW data types in BODS


Dear Experts

 

We are moving data from Oracle to a HANA database using Data Services 4.2 SP2 with the bulk load option.

Things are not working when the source has a combination of BLOB and CLOB columns. I am using blob as the DS data type for Oracle BLOB, long as the DS data type for Oracle CLOB, and varchar for Oracle RAW.

 

After running the job (which uses only a Query transform to map columns), the trace log shows record counts in all threads that match the number of records in the source table, but the target load never completes; I waited patiently for hours with no luck.

 

1. Doesn't the HANA bulk loader support these data types?

2. If I don't use the bulk load option, DS loads only one record to the target.

 

Kindly suggest a solution for how to treat these data types in DS.

 

Many Thanks

Ramakrishna

Cannot import extractor internal error occurred (BODI-1116153)

$
0
0

Hi All

 

I am getting the error below when I try to import a custom extractor in BODS:

 

 

Cannot import extractor <9AZDP_ST_BW> internal error <Extractor: RDPS_REPL_SOURCES_GET_LIST failed> occurred (BODI-1116153)

 

 

 

 

I checked with our BASIS team, and they said that SAP Note 1585204 cannot be implemented. We also checked transaction SE38 with the help of an ABAP developer and tried to expose the extractor using RODPS_OS_EXPOSE, but it is not working. Our BASIS person says that no note needs to be implemented.

 

I also referred to "Error on importing Extractor from SAP System into BOBJ Data Service Designer".

 

Regards

Arun Sasi


