Channel: SCN : Discussion List - Data Services and Data Quality

Cannot import jobs after the first import


Hi,

We transfer jobs to our QAS Data Services system via ATL files, but now we cannot import the second, third, and subsequent jobs into the QAS repository.

The first job imports fine, but later jobs do not: Data Services Designer freezes, and the only fix is to reboot the DS server.

After a reboot, the first (possibly a new) job imports fine again, but the others still don't. I can't understand why.

Data Services 4.0 SP3 (Windows)


Oracle stored procedure as source in BODS


Hello Experts,

I am working on a project where all the source data comes from stored procedures (PL/SQL scripts), and I have to execute a procedure to get the data. Is it possible to populate a target table in BODS this way? Could you provide me with how-to information, please?

The data flow should look like this:

stored procedure -> query transform -> target table
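
One pattern that may work (a minimal sketch, not a confirmed solution): call the procedure from a script object so that it fills a staging table, then use that staging table as the ordinary source of the data flow. The datastore name DS_ORACLE, the procedure PKG_SRC.FILL_STAGING, and the staging table STG_SOURCE below are all hypothetical placeholders:

# Script object placed before the data flow in the job. Executes the
# procedure in the Oracle datastore; the procedure is assumed to fill
# the staging table STG_SOURCE with the rows to be loaded.
sql('DS_ORACLE', 'begin PKG_SRC.FILL_STAGING; end;');

# Optional sanity check: report how many rows the procedure staged.
print('Rows staged: ' || sql('DS_ORACLE', 'select count(*) from STG_SOURCE'));

The data flow itself then becomes: STG_SOURCE -> query transform -> target table.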

SAP Data Services 4.2 SP03 is not compatible with SAP IQ 16.0 SP08


Hello,

 

I am having serious problems and errors while bulk loading data into SAP IQ 16.0 SP08 via SAP Data Services 4.2 SP03.

I created a datastore (type Sybase IQ 16.x) in SAP Data Services 4.2 SP03 for my SAP IQ 16.0 SP08 database, then created a sample data flow with an SAP IQ table as the target. When I bulk load data into this table (with the default target-table bulk-load options, i.e. BINARY format checked), I get an ACCESS_VIOLATION error. I found somewhere that this is a known bug in SAP IQ 16.0 SP08, so I installed all three patches for that version, but that didn't solve the problem. Incidentally, in SAP IQ 16.0 SP01 the bulk load in BINARY format works correctly.

Then I unchecked the BINARY format (so the bulk load runs in ASCII format) and executed the job. The ACCESS_VIOLATION error disappeared, but a different error appeared: Sybase IQ syntax error near 'BLOCK'. I discovered that this comes from an incorrectly formed SAP Data Services bulk-load statement: Data Services includes a BLOCK SIZE clause in the bulk-load statement, even though BLOCK SIZE was discontinued and has not been supported since IQ 15.2. Again, SAP Data Services generates a correct bulk-load statement for IQ 16.0 SP01.

The workaround would be to clear the Block size (bytes) field in the SAP Data Services IQ target-table Bulk Loader Options window. The problem is that this field is no longer editable, so there is no way to change or empty its value. Notably, SAP Data Services 4.2 SP01 has the same problems with SAP IQ 16.0 SP08 as 4.2 SP03, but in 4.2 SP01 the Block size (bytes) field is still editable, so the workaround can be applied there.

SAP IQ 16.0 SP08 is not listed in the SAP Data Services 4.2 SP01-SP03 PAM, and all my problems are probably due to this incompatibility (which is strange, because IQ 16.0 SP08 was released in June and SAP Data Services 4.2 SP03 only within the last week or so). But I must use SAP IQ 16.0 SP08 because it fixes another critical IQ crash, and I must use SAP Data Services 4.2 SP03 because it introduces a must-have feature for me: Data Masking.

Is there any way to use these two versions together? Is there any way to make them work correctly together?

 

All ideas and suggestions are more than welcome.

Thank you.

Donatas Budrys

Connection to Dynamics CRM Online


Hi all, I'm not an expert with SAP Data Services, and need some help.

Do you have any experience reading from or updating data in Dynamics CRM Online using Data Services? Which type of connection are you using (SOAP web services or OData)?

Does the Office 365 authentication work correctly from Data Services?

Is there any way to test this type of connection, or do I need to set up a full DEV environment for it?

 

thank you very much for your help,

 

best regards

Default Substitution Parameter settings - Wrong Path


Substitution_Parameters_Default.png

 

Hi all,

 

I have done a fresh installation of BODS 4.2 SP1 on my Windows machine.

 

When I check the default substitution parameters, an incorrect path is shown, as below:

 

C:/Program Files (x86)/SAP BusinessObjects/Data Services//admin/repo//DataQuality/reference_data

 


What is the reason for this?

Profiler server installation


Dear BODS experts,

 

I am trying to do a test install of the profiler server on an existing BODS (version 4.2) system. I have created the repository, created a user, assigned admin privileges to that user, and given it owner access to the repository, but I still get this:

 

Any guidance is highly appreciated.

 

Regards,

 

MD

Problem with Transport in ECC, Files from Data Services


Hi, I have a problem with a transport into one ECC environment. The logs show Unicode errors in many functions.

The Unicode parameter is the same in both environments, and I don't know what happened.

Note: I have another ECC environment where the same transport works fine.

The logs show the following:

 

Program "/BODS/CL_EX_BADI_BODS=========CP": Syntax error in line "000000"

"The program /BODS/CL_EX_BADI_BODS=========CP is not "" support Unicode as the attributes of software

Database COMMIT executed

 

Program "/BODS/SAPLBODS": Syntax error in line "000000"

"The program /BODS/SAPLBODS is not "" support Unicode as the attributes of software

Database COMMIT executed

 

Program "/BODS/IF_EX_BADI_BODS=========IP": Syntax error in line "000000"

"The program /BODS/IF_EX_BADI_BODS=========IP is not "" support Unicode as the attributes of software

Database COMMIT executed

 

Program "/BODS/RPT_STREAM_READ_TABLE": Syntax error in line "000000"

"The program /BODS/RPT_STREAM_READ_TABLE is not "" support Unicode as the attributes of software

Database COMMIT executed

 

Screen "/BODS/RPT_STREAM_READ_TABLE" "1000" successfully generated

Database COMMIT executed

While Loop in DI


Hi,

I need to load a set of 40 text files into SQL Server using the DI ETL process.

I created the while loop, but when I put my variable in the While Loop editor it does not work. For example, I created a global variable $filename for the file name and a counter $counter. In the condition at the top of the While Loop editor I put ($counter < 40), and inside the loop I have the file format needed to load the data, but nothing is loaded, and validation gave no errors. The job reports that it ran successfully, yet it loads no data into the SQL table.

My question is: does anyone know the correct syntax for a while loop in DI when you have 40 files to load?
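
One common cause of exactly this symptom is an uninitialized counter: if $counter is never assigned a starting value, it is NULL, the condition ($counter < 40) never evaluates to true, and the loop body is silently skipped, so the job "succeeds" while loading nothing. Here is a minimal sketch of the usual pattern (assuming files named file_1.txt through file_40.txt; the names and the flat-file format are placeholders to adapt):

# Script object placed before the while loop:
$counter = 1;

# Condition entered at the top of the While Loop editor:
#   $counter <= 40

# Script inside the loop, before the data flow, building the next file name.
# DS converts the int $counter to varchar implicitly in the concatenation:
$filename = 'file_' || $counter || '.txt';

# The data flow inside the loop reads a flat-file format whose File name
# property is set to the global variable $filename.

# Script after the data flow, still inside the loop:
$counter = $counter + 1;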


Error Log File Path - Data Services 3.1


Hi

 

We have numerous jobs failing that previously ran fine, and I suspect the BODS server itself. The trace logs are not giving much information, and we believe they are actually incomplete and missing things. I understand an error file (error.txt?) is written somewhere on the job server.

 

What is the file path for this error file?

Uploading long text from Excel using Data Services


Hi,

 

We are facing an issue when uploading long texts (longer than 255 characters) from an Excel file using Data Services.

The problem is that if the first 16 rows of the source Excel file don't contain texts longer than 255 characters, the Data Services ODBC driver truncates all further texts in that column to 255 characters, even though the field is defined as varchar(2500) in the Excel file format in Data Services and the column does contain longer texts in later rows. If I put a long text in one of the first rows of the source file, all subsequent long texts are uploaded correctly with their full length.

I've tested this issue with the simplest possible data flow, with one source file in Excel format and one target file in CSV format; the problem occurs at the very first step, when the Excel Workbook file format reads the data.

We've tried setting the parameter TypeGuessRows = 0, following Microsoft's recommendations (http://support2.microsoft.com/kb/189897/en-us), but we are still facing the issue. With TypeGuessRows = 0, if the first 16384 rows of the Excel file don't contain long texts, all texts in that column are truncated to 255 characters.

We can't manually add a dummy first row with long texts, because these .xlsx files arrive from an external system daily and have to be uploaded automatically.

Do you have any ideas for a workaround? Is there a way to add a dummy row automatically with some kind of script? Or to convert xlsx to csv somehow?

 

Thanks,

Jeny

BODS 4.2 - BOExcelAdapter


I'm running into an issue with the Linux Excel adapter in which some .xls files containing blank lines at the end throw a null pointer exception. I'm processing a directory containing about 40 .xls files, and three of them contain empty lines at the end. Of those three, one throws the null pointer exception pasted below. I've tried running this with both the stock Apache POI jar files and the latest stable release from Apache, with the same results.

 

I'm hoping someone can help me with one of three things. Is there a direct fix for this issue? Is there a workaround (some sort of pre-processor/scrubber)? Or is there a way to be notified which of the 40 files is failing, so it can be manually remediated (I found the bad file by process of elimination this time)?

 

Log files, with all trace messages enabled, are attached, but here is a paste of the error:

 

RUN-058107: |Data flow DF_Project_Budget_Init|Reader Project_Budget
Error reading from <Project_Budget>: <java.lang.NullPointerException: while trying to invoke the method
org.apache.poi.ss.usermodel.Row.getFirstCellNum() of a null object loaded from local variable 'lastRow'
at com.acta.adapter.msexceladapter.MSExcelAdapterReadTable.getTotalRows(MSExcelAdapterReadTable.java:1324)
at com.acta.adapter.msexceladapter.MSExcelAdapterReadTable.ReadAllRows(MSExcelAdapterReadTable.java:1096)
at com.acta.adapter.msexceladapter.MSExcelAdapterReadTable.readNext(MSExcelAdapterReadTable.java:1232)
at com.acta.adapter.sdk.StreamListener.handleBrokerMessage(StreamListener.java:178)
at com.acta.brokerclient.BrokerClient.handleMessage(BrokerClient.java:406)
at com.acta.brokerclient.BrokerClient.access$100(BrokerClient.java:53)
at com.acta.brokerclient.BrokerClient$MessageHandler.run(BrokerClient.java:1559)
at com.acta.brokerclient.ThreadPool$PoolThread.run(ThreadPool.java:100)

 

And here are screenshots of the source config:

sourcecfg.jpg, formatcfg.jpg

Need to pass two data source names in a SQL query in a BODS script


Hi Team,

 

I have a requirement to move data from one source to another. My source is an Oracle view and my destination is a SQL Server table. I need to insert the records from source to destination using a script in BODS, not a Query transform.

I have written the code below:

 

sql("sql datastore name", "insert into sqltablename (col1,col2) select col1,col2 from oracledatastorename.schemaname.tablename");

 

I am getting the error "oracledatastorename.schemaname.tablename doesn't exist".

 

Could you please help me resolve this issue, or is there another technique we can use in BODS? I don't want to use a Query transform.
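
For context on why the error occurs: a sql() call runs entirely inside the one datastore named in its first argument, so the target database has no idea what an Oracle datastore name means. One possible direction (an untested sketch; it assumes a SQL Server linked server named ORA_LINK has been configured by a DBA to point at the Oracle source, which happens outside BODS) is to let SQL Server pull the rows from Oracle itself:

# Runs entirely against the SQL Server datastore; SQL Server reaches into
# Oracle through the hypothetical linked server ORA_LINK via OPENQUERY.
sql('SQL_DATASTORE', 'insert into dbo.sqltablename (col1, col2) select col1, col2 from openquery(ORA_LINK, \'select col1, col2 from schemaname.tablename\')');

A database link defined on the Oracle side could support the same idea in the opposite direction.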

 

 

Thanks,

Sundaram

MS Access 2013 Connection in SAP BODS 4.2


Hi All,

 

How do I create an MS Access 2013 connection in SAP BODS 4.2?

 

Thanks in advance.


Trying to load an Excel workbook, getting the error "OLE or COM processing error. Please make sure Microsoft Access Database Engine is properly installed"


Hi There,

Has anyone come across the following error? Please guide me on this.

I have been trying to import/load an Excel workbook.

Although my batch job executes successfully, I get the following error message.

 

error.JPG

 

 

I have Microsoft Office 2010, and I tried repairing and reinstalling the MS Access Database Engine 2010 from the following site:

http://www.microsoft.com/en-us/download/details.aspx?id=13255

 

Yet I am still unable to import the schema. FYI, I tried loading both .xls and .xlsx files, but neither gets loaded.

 

 

Thanks & regards

 

A Junaid


Event triggering


Hi experts,

 

In our Data Services and Information Steward landscape we have set up a BusinessObjects environment with Information Platform Services (4.1 Support Pack 4).

 

In order to automate the process, the Information Steward task job should start after the Data Services job has finished successfully. Therefore I use the schedule event in the CMC, as described in this article: How to schedule Information Steward task to run after Data Services job is completed? - Enterprise Information Managemen…

 

1. The Data Services job extracts the data from the source application(s) and loads it into the staging repository.

2. Information Steward profiling tasks and rule validation tasks run against the staging repository.

3. Optionally there may be another Data Services job that moves the 'Failed Data' identified by Information Steward into a data mart to enable custom BI reporting.

 

Problem: the Information Steward task job always starts at the same time as the Data Services job, so the event is triggered before the Data Services job has completed. Is there any workaround for this issue?
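
One workaround idea (an untested sketch, not a confirmed solution): make the Data Services job create a trigger file as its very last step, and define a file-based event in the CMC that watches for that file, so the Information Steward task fires only once the file appears. The path and the exec() flag value below are assumptions to adapt:

# Final script object in the Data Services job: create the trigger file
# that the CMC file-based event is watching (path is a placeholder).
exec('cmd.exe', '/C echo done > D:\\triggers\\ds_job_finished.trg', 8);

The Information Steward task schedule is then set to wait on that file-based event rather than on a fixed time, and the trigger file can be deleted at the start of the next Data Services run.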

 

Thanks in advance.

SAP BODS Excel file open error


Hi All,

 

I am creating a very basic job which reads an Excel file as input and produces a flat file as output.

The Excel file is on my local machine.

But when I execute the job, I get the error below:

 

12448 1620 FIL-080101 15/12/2014 14:05:15 Cannot open file <//Client/C$/311649/work/MLE - Demand Forecasting/sample data files/ProductService - ProductGroup.xls>. Check

 

Can someone please help me figure out what is wrong with my job?

 

Regards

Swati

RFC server is failing to start


Hi All,

 

Environment: Data Services 4.2 / DQM 4.0 / RFC 4.0 on Linux x86_64, ECC6 EHP7 with the latest DQM add-on (FLDQ 02)

 

./start_<SID>_trans (output right after the install):

Business Objects an SAP Company: BusinessObjects DQ Mgmt for SAP Solutions - RFC Server 4.0

Startup status: Reading configuration settings...

Startup status: Accessing SAP client metadata...

RFC server shutting down by user request.

bodsvsdb $

 

To update the DQM/RFC versions (on Linux x86_64):

1. How do I check which version of DQM is installed?

2. How do I uninstall DQM?

3. Is there an SAP Note with a version compatibility matrix for the components (Data Services / DQM / RFC server / DS components) that I should check before proceeding with the installation?

 

The guide I referred to doesn't have details on the above concerns:

(SAP BusinessObjects Data Quality Management, version for SAP Solutions Document Version: 4.0 SP05 - 2014-07-17)

 

Please help as soon as possible and guide me with expert suggestions to set up the environment.

 

Thanks

Ram

Getting error when calling Stored Procedure in script


Hi,

 

I am getting an error while executing stored procedures on an Oracle 11g source database, called as below in a BODS 4.1 script.

 

sql( 'ds_xxx','begin schemaxxx.PAC_xxx_LOAD.PRC_xxx;end;');

sql( 'ds_xxx','begin schemaxxx.PAC_xxx_LOAD.PRC_xxx;end;');

sql( 'ds_xxx','begin schemaxxx.PAC_xxx_LOAD.PRC_xxx;end;');

 

Steps:

 

1. Imported the stored procedures into the datastore's functions.

2. Called the stored procedures in a script.

 

Error:

 

The 1st SP executes correctly, but when the 2nd and 3rd SPs run, I get the error below.

 

Error_SP.jpg

I would appreciate your help on this.

Some dataflows missing during job execution


Hi,

 

Some of the dataflows are missing from the monitor log during execution of the job, but those dataflows do appear in the auto documentation and in the trace log of the same job.

 

Could anyone please help me resolve this issue?

 

Regards

Asgar
