Channel: SCN : Discussion List - Data Services and Data Quality
Viewing all 4237 articles
Browse latest View live

BODS 4.2 and datastores SQL Server 2012 migration - central repo problem


I was wondering whether you have encountered this before:

 

Setup:

  • BODS 4.2
  • SQL Server 2012 (datastores only)
  • Repos on SQL Server 2008 (as per PAM support)

 

We upgraded Data Services from 3.1 to 4.2 at the same time as upgrading the staging and target databases to SQL Server 2012.

 

This has gone OK, and developers' local repositories can adjust their datastore configurations to make the datastores SQL Server 2012 and run code.

 

The only problem arises when checking in/out of the central repository or getting latest. Even though I have checked out the datastores including dependents and flipped the configuration to 2012, on check-in only the parent datastore's changes are checked in; the table metadata doesn't change and hence isn't checked in (as this is the default way the central repo works).

 

This means that when developers then check out or get latest, they overwrite their local repo table definitions with the 2008 versions and cannot run anything. They then have to flip the configuration to 2008 and back to 2012 to correct this.

 

Is there a way to force a check-in of central repo table metadata/definitions without any physical changes to the schema etc.? Is this stored at DF grain? It does list all the DFs affected by the config flip.

 

One suggestion was to set the database compatibility level to 2008, but that's really a last resort. The only other option I can see is checking all the code out and making small changes, like descriptions on tables, to force them to check in as 2012. We have a lot of code, so this is something we don't really want to entertain.

 

This is the error message once getting from central with datastore config set as SQL Server 2012:

 

6460 5400 RUN-050413       20/03/2014 16:38:29        |SESSION JF_TEST_JOB_Live_Insight_CRM|WORKFLOW WF_Initialise_DM_CUSTOMER|DATAFLOW DF_Load_LKP_CUSTOMER_CODES

6460 5400 RUN-050413       20/03/2014 16:38:29        None of the configurations for the specified target table <LI_Extract.DW_STAGE.LKP_CUSTOMER_CODES> contain a database match

6460 5400 RUN-050413       20/03/2014 16:38:29        <Microsoft SQL Server 2012> for the current datastore <LI_Extract>.  For optimal results, always migrate data flows with

6460 5400 RUN-050413       20/03/2014 16:38:29        associated datastore information. NOTE: If you migrated data flows without associated datastore information, import the

6460 5400 RUN-050413       20/03/2014 16:38:29        necessary datastore from the source and add a new datastore configuration to match the new environment. When adding a new

6460 5400 RUN-050413       20/03/2014 16:38:29        datastore configuration, set the "Use values from" option under "Values for table targets and SQL transforms" in the "Create

6460 5400 RUN-050413       20/03/2014 16:38:29        New Configuration" window to use the configuration from which you migrated. You can make the new datastore configuration your

6460 5400 RUN-050413       20/03/2014 16:38:29        default or use system configuration to instruct Data Services to use the new datastore configuration. For more migration

6460 5400 RUN-050413       20/03/2014 16:38:29        information, see the %4 Advanced Development and Migration Guide.


BODI-300056 error when migrating repository from 3.2 to 4.2


I just installed IPS 4.1 SP2 and Data Services 4.2 successfully. However, when I tried to migrate a repository using Repository Manager, I got the captioned error. The message indicated that a primary key already existed and therefore could not be recreated, so the local repository upgrade was cancelled. I am attaching the detailed message below...

 


Real-Time service running but not able to setup webservice


Hi, I have successfully configured a real-time service.

In the admin console, there is a green tick in the box and the status is 'service is started'.

I would now like to expose this service as a webservice, so I go to:

Web Services

Web Service Configuration

Select 'Add realtime service...' and click 'apply'

The access server is displayed, but the realtime service does not show here.

 

I have configured only one access server.

It is not SSL enabled.

I am running Data Services 4.2 on W2k

 

Any suggestions?

 

Thanks,

 

Jan.

Where is the .bat command when exporting an execution command?


From the administrator console, I generate an execution command.

I get confirmation that the export was successful and that a password file has been generated at:

 

"on Job Server GBxxxxxx:3500, at C:/ProgramData/SAP BusinessObjects/Data Services/conf/Repo_GBLONBO11.txt."

 

I have no problem locating the password file, but where is the .bat file?

 

I have searched the entire server for jobname.bat but found nothing.

I have logged on to the server using the same username as the user which runs the Data Services service.

 

In some other posts I found a reference to C:/ProgramData/SAP BusinessObjects/Data Services/common

but that folder does not exist.

 

We're using Data Services 4.2 on W2k.

 

Thanks for your help,

 

Jan.

SSO setup and configuration for sap business objects 4


According to the Business Intelligence Platform Administrator Guide, SAP BO provides SSO.

To authenticate user logins coming from external systems, do I need to set up SSO?

And without SSO, how do you authenticate user logins?

 

 

CMC, Web Intelligence: BI PLATFORM 4.1 SP2

Dashboard: DASHBOARDS 4.1 SP2

Explorer: EXPLORER 4.1 SP2

Data services: SBOP DATA SERVICES 4.2 SP01

 

 

Please share your opinions. Thanks.

How to load the Source Table data into target IDOC XML format (DEBMAS06) ?


Hi ,

 

I have a requirement as below; kindly help me with how to do it step by step.

 

I have a source table with 30 columns, and this table's data needs to be loaded into IDoc XML format.

The target IDoc (basic type DEBMAS06) should be in XML format.

 

Please explain the solution step by step, as I am new to working with IDocs.


Regards,

Vijay

Does the SQL Transform get pushed down to the HANA database


We have a project that needs a full outer join.  We know how to do that, but we are weighing whether writing the full outer join in a SQL Transform is the better option performance-wise.  The two tables are on HANA.  Would the SQL within the SQL Transform get pushed down to HANA to run, or are we better off doing left and right outer joins and then merging and removing duplicates, as shown in the attached?

How to create "Full Outer Join" in SAP BODS
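Whether the SQL Transform's statement gets pushed down depends on your version and setup, but the equivalence behind the alternative is easy to check. Below is a plain-Python sketch (not BODS syntax) showing that a FULL OUTER JOIN equals the union of a LEFT OUTER JOIN and a RIGHT OUTER JOIN with duplicates removed; the table contents and column names are invented for illustration.

```python
# Plain-Python sketch: FULL OUTER JOIN == LEFT OUTER JOIN union RIGHT OUTER
# JOIN with duplicate rows removed. Rows are dicts; the join key is "id".

def left_outer(left, right):
    """Left outer join on "id"; unmatched right columns become None."""
    right_by_id = {r["id"]: r for r in right}  # assumes unique keys
    return {
        (l["id"], l["a"], right_by_id[l["id"]]["b"] if l["id"] in right_by_id else None)
        for l in left
    }

def right_outer(left, right):
    """Right outer join on "id"; unmatched left columns become None."""
    left_by_id = {l["id"]: l for l in left}
    return {
        (r["id"], left_by_id[r["id"]]["a"] if r["id"] in left_by_id else None, r["b"])
        for r in right
    }

def full_outer(left, right):
    # the set union drops the inner-join rows duplicated in both halves
    return left_outer(left, right) | right_outer(left, right)

left = [{"id": 1, "a": "x"}, {"id": 2, "a": "y"}]
right = [{"id": 2, "b": "p"}, {"id": 3, "b": "q"}]
print(sorted(full_outer(left, right), key=lambda t: t[0]))
# → [(1, 'x', None), (2, 'y', 'p'), (3, None, 'q')]
```

The union step is exactly the "merge and remove duplicates" stage of the workaround in the linked document.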

Jobs which are using AL_ENGINE.EXE PIDs


Hi Experts,

 

I need your help to find the jobs that are currently running on the DS servers.

 

Below are the environment details:

 

Windows 2008, 32 GB RAM. BI 4.0 SP6 + DS 4.1 SP2 on the same server.

 

I have 3 repositories on the server. Sometimes the server becomes unresponsive, with al_engine.exe processes consuming 100% of the CPU.

 

I would like to find the jobs that are currently running and their related al_engine.exe processes in Task Manager.

 

Please see the screenshot below. Is there any way to find the jobs and the corresponding PIDs of al_engine.exe?

 

This will help me find the jobs that are consuming the most CPU and memory.

 

 

Lo.JPG
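One avenue worth exploring, as an assumption rather than a documented mapping: on Windows you can list al_engine.exe processes with their full command lines (e.g. `wmic process where "name='al_engine.exe'" get ProcessId,CommandLine /format:list`), and the command line often references job-specific files such as the trace log, which lets you relate a PID back to a job. The sketch below only parses such output; the sample command lines and flags in it are invented for illustration.

```python
# Parse `wmic ... /format:list` style output (Key=Value lines, records
# separated by blank lines) into a list of dicts, one per process.

def parse_wmic_list(text):
    procs, current = [], {}
    for line in text.splitlines():
        line = line.strip()
        if not line:
            # blank line ends the current record
            if current:
                procs.append(current)
                current = {}
            continue
        key, _, value = line.partition("=")
        current[key] = value
    if current:  # last record may not be followed by a blank line
        procs.append(current)
    return procs

# Invented sample output for illustration only.
sample = """CommandLine=al_engine.exe -PLocaleUTF8 -Ctrace_job_FI_LOAD.txt
ProcessId=6460

CommandLine=al_engine.exe -PLocaleUTF8 -Ctrace_job_HR_LOAD.txt
ProcessId=5400
"""

for p in parse_wmic_list(sample):
    print(p["ProcessId"], p["CommandLine"])
```

Cross-referencing the PID against the job's trace log in the Management Console should then identify which job owns the hungry process.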


I would like to set job trigger on Data Service job.


Hi Expert,

 

I would like to know how to set up a job trigger on a Data Services job, because our system currently has separate jobs to get data from the FI module, HR module, and others. I want to execute the jobs as follows:

 

Start --> FI job run --> FI job finished --> trigger --> HR job run --> HR job finished --> trigger --> Other etc.

 

Or, if you have any other ideas, please advise me.

Thank you for your advice.
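One way to get this sequencing from outside Data Services is to export each job as an execution command (.bat) from the Management Console and chain them, only triggering the next job when the previous one exited successfully. A minimal sketch, with hypothetical .bat file names and the runner made injectable so the chain logic is testable:

```python
# Run exported DS execution commands in sequence; stop at the first failure,
# which is the "trigger on success" semantics described above.
import subprocess

JOBS = ["JOB_FI_LOAD.bat", "JOB_HR_LOAD.bat", "JOB_OTHER_LOAD.bat"]  # hypothetical

def run_bat(bat):
    """Run one exported execution command and return its exit code."""
    return subprocess.run(["cmd", "/c", bat]).returncode

def run_chain(jobs, run=run_bat):
    for bat in jobs:
        rc = run(bat)
        if rc != 0:
            return rc  # downstream jobs are NOT triggered
    return 0
```

Within Data Services itself, the cleaner alternative is a single parent batch job whose workflows run the FI, HR and other loads in sequence, since workflows placed one after another in a job execute serially.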

The repository TESTDS1 cannot be found


Hi,

 

We have copied everything from one repository to TESTDS1. This repo exists, and we are able to log in to it using SoapUI.

 

But we get the error below when we try to call Import_Repo_Object:

 

<?xml version="1.0" encoding="utf-8" ?>
<soapenv:Envelope xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/">
  <soapenv:Body>
    <localtypes:RepoOperationResponse xmlns:localtypes="http://www.businessobjects.com/DataServices/ServerX.xsd">
      <returnCode>1</returnCode>
      <errorMessage>The repository TESTDS1 cannot be found.</errorMessage>
    </localtypes:RepoOperationResponse>
  </soapenv:Body>
</soapenv:Envelope>

 

Any thoughts on why this issue would appear?

 

Thanks,

Uma

Create XML file from table data


Dear All,

 

 

With Data Services 4.0, I want to create an XML file from table data.

The table has a single column but multiple records, for example:

0001000488;100;EUR;

0001000489;200;EUR;

0001000450;300;EUR;

 

 

My desired XML output:

<Data>

  0001000488;100;GBP;

  0001000489;200;EUR;

  0001000450;300;EUR;

</Data>

 

 

I tried with a simple query, but the system writes only the last record to the XML file:

<Data>

  0001000450;300;EUR;

</Data>
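The "only the last record" symptom is typical of producing one document per input row, each write overwriting the file. A plain-Python sketch (not a BODS transform) of the intended result, building the document from all rows at once:

```python
# Build a single document that nests ALL rows under the root element,
# instead of emitting (and overwriting) one document per row.
rows = ["0001000488;100;EUR;", "0001000489;200;EUR;", "0001000450;300;EUR;"]

doc = "<Data>\n" + "\n".join(f"  {r}" for r in rows) + "\n</Data>"
print(doc)
```

In Data Services this usually corresponds to nesting the row schema under the root element in the query feeding the XML target, so the whole input set maps into one document rather than one document per row.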

 

 

Can anyone help me?

 

 

Thanks in advance.

Simone

MS SQL as source and Target


I have MS SQL as datastore.

 

I am not able to use it as either a target (with ECC as the source) or a source (with HANA as the target). I can preview data in DS Designer, so I guess it is an issue with the job server. I have read two opinions on this issue: the first suggests using ODBC as the datastore type, while the other mentions some setting. I guess I will try ODBC, but if you know of a setting that can take care of the issue, I would appreciate it if you could share it.

 

The error I have is:

 

 

 

------------------------

(14.1) 04-04-12 08:04:49 (E) (52013:3645134592) REP-100102: |Data flow New_DataFlow5

                                                            Unsupported database type <Microsoft_SQL_Server> specified as repository. Valid types are <Oracle, ODBC, MySQL, Sybase, Memory,

                                                            MaxDB, HANA, DB2>. Ensure database type matches in spelling and case to the supported types.

(14.1) 04-04-12 08:04:49 (E) (52009:3013994240) REP-100102: |Data flow New_DataFlow5

                                                            Unsupported database type <Microsoft_SQL_Server> specified as repository. Valid types are <Oracle, ODBC, MySQL, Sybase, Memory,

                                                            MaxDB, HANA, DB2>. Ensure database type matches in spelling and case to the supported types.

 

----------------------------

Monitor Entries are as below

------------------

 

(14.1) 04-04-12 08:04:32 (52009:3013994240)      JOB: The initial environment locale <eng_us.utf-8> has been coerced to <Unicode (UTF-16)> ().

(14.1) 04-04-12 08:04:33 (52009:3013994240)      JOB: Reading job <876188c9_7c92_4369_a10e_082aaa5fe63a> from the repository; Server version is <14.1.0.340>; Repository version is

                                                      <14.1.0.0000>.

(14.1) 04-04-12 08:04:33 (52009:3013994240)      JOB: Current directory of job <876188c9_7c92_4369_a10e_082aaa5fe63a> is </build/boe/dataservices/bin>.

(14.1) 04-04-12 08:04:33 (52009:3013994240)      JOB: Starting job on job server host <hanasvr-08>, port <3500>.

(14.1) 04-04-12 08:04:33 (52009:3013994240)      JOB: Job <New_Job5> of runid <20120404080433520093013994240> is initiated by user <boeuser>.

(14.1) 04-04-12 08:04:33 (52009:3013994240)      JOB: Processing job <New_Job5>.

(14.1) 04-04-12 08:04:33 (52009:3013994240)      JOB: Optimizing job <New_Job5>.

(14.1) 04-04-12 08:04:33 (52009:3013994240)      JOB: Job <New_Job5> is started.

(14.1) 04-04-12 08:04:34 (52013:3645134592) DATAFLOW: Process to execute data flow <New_DataFlow5> is started.

(14.1) 04-04-12 08:04:34 (52013:3645134592)      JOB: Initializing transcoder for datastore <GTLECC> to transcode between engine codepage<Unicode (UTF-16)> and datastore codepage

                                                      <<DEFAULT>>

(14.1) 04-04-12 08:04:34 (52013:3645134592)      JOB: Initializing transcoder for datastore <GLPNOP> to transcode between engine codepage<Unicode (UTF-16)> and datastore codepage

                                                      <<DEFAULT>>

(14.1) 04-04-12 08:04:48 (52013:3645134592) DATAFLOW: Data flow <New_DataFlow5> is started.

(14.1) 04-04-12 08:04:49 (52013:3645134592) DATAFLOW: Cache statistics determined that data flow <New_DataFlow5> uses <0> caches with a total size of <0> bytes, which is less than

                                                      (or equal to) the virtual memory <3757047808> bytes available for caches. Data flow will use IN MEMORY cache type.

(14.1) 04-04-12 08:04:49 (52013:3645134592) DATAFLOW: Data flow <New_DataFlow5> using IN MEMORY Cache.

(14.1) 04-04-12 08:04:49 (52013:3645134592) DATAFLOW: Data flow <New_DataFlow5> is terminated due to error <100102>.

(14.1) 04-04-12 08:04:49 (52013:3645134592) DATAFLOW: Process to execute data flow <New_DataFlow5> is completed.

(14.1) 04-04-12 08:04:49 (52009:3013994240)      JOB: Job <New_Job5> is terminated due to error <100102>.

Issue with Lookup_Ext when possible match is NULL


I have stumbled upon a quirk with lookup_ext which is causing me some grief. I need to do a very complex lookup and have hit a roadblock with the last column comparison. This column can contain either a null or a real value, and this is where lookup_ext fails to match the records. Here is a simplified example of what I am doing:

 

Transaction Table

 

Record ID | Match_A | Match_B | Match_C
1 | Gloves | Leather | Brown
2 | Gloves | Driving | Brown
3 | Gloves | Leather | NULL
4 | Gloves | Driving | NULL
5 | Gloves | NULL | Brown
6 | Gloves | NULL | Black

 

 

Look-up Table

 

Match_A | Match_B | Match_C | Season | Stock_Ind
Gloves | Leather | Brown | Winter | Y
Gloves | Leather | Black | Winter | N
Gloves | Driving | Brown | Summer | N
Gloves | Driving | Black | Summer | Y
Gloves | Leather | NULL | Winter | Y
Gloves | Driving | NULL | Summer | Y
Gloves | NULL | Brown | Any Season | N
Gloves | NULL | Black | Any Season | N
NULL | Leather | Brown | Unknown | Y
... and so on.

 

I am using the lookup_ext wizard as I have to return multiple columns; however, when I run the routine it always fails to match where NULL = NULL. I think lookup_ext only matches NULL when you specify Is Null, but in this case I can't do that. Here is the output that the above tables would give me in the match routine:

 

Record ID | Match_A | Match_B | Match_C | Season | Stock_Ind
1 | Gloves | Leather | Brown | Winter | Y
2 | Gloves | Driving | Brown | Summer | Y
3 | Gloves | Leather | NULL | NULL | NULL
4 | Gloves | Driving | NULL | NULL | NULL
5 | Gloves | NULL | Brown | NULL | NULL
6 | Gloves | NULL | Black | NULL | NULL
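The usual workaround is to coerce NULL to a sentinel value on BOTH sides of the comparison, since NULL never compares equal to NULL (in BODS that would typically be an expression like nvl(Match_B, '#NULL#') on the input column and the same expression on the lookup column). A plain-Python sketch of the idea, with data abridged from the tables above:

```python
# Sentinel-based NULL matching: map None to a value that cannot occur in the
# real data, on both the probe key and the lookup key.
SENTINEL = "#NULL#"

def key(a, b, c):
    return tuple(SENTINEL if v is None else v for v in (a, b, c))

# Abridged lookup table from the post above.
lookup = {
    key("Gloves", "Leather", "Brown"): ("Winter", "Y"),
    key("Gloves", "Leather", None):    ("Winter", "Y"),
    key("Gloves", None, "Brown"):      ("Any Season", "N"),
}

print(lookup.get(key("Gloves", "Leather", None)))  # the NULL row now matches
# → ('Winter', 'Y')
```

The sentinel must be a value that genuinely never appears in the column, otherwise real rows would collide with the coerced NULLs.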

Unable to find SAP DS Administrator Pane in DS Management Console


Hi BODS experts,

 

I am unable to find the SAP DS Administrator pane in the DS Management Console.

 

Can anyone help me get to the Administrator pane?

 

Capture.JPG

 

Thanks in advance

 

Regards

Srinivas

Error in BW import objects


Hello,

 

We're trying to import into the Quality BW environment the objects required to store the information generated by SAP Data Services.

In transaction RSA1, the source system for DS has been created and the check works fine, but when we import the transport order the following error is shown:

 

Error_BW.jpg

Could you help us?

 

Thanks,

Àlex


Cannot view the job log (java.io.IOException)


hello,

 

I cannot view several job logs in the Data Services Management Console.

In the Trace Job Log, there is an error "java.io.IOException".

But I can view other logs from the same repository and job server.

The trace file and the monitor file of the job log are both in the directory D:\localapp\apps\SAP BusinessObjects\Data Services\log\JobServer_1\....

 

 

Thanks for your help

Xin

Best UI? - SAP BODS


Hi Experts..

I need guidance regarding the best UI that can be used with SAP BODS.

For example, suppose I have an interface where a user can input two values, a and b.

The sum (a+b) and difference (a-b) should be calculated in BODS, and the output should be displayed in the UI with the correct values.

Which is the best UI I can use with BODS? I only have small functions to create inside SAP BODS, like Sum, Avg, Diff, Conversion, etc.

Please suggest.

Regards,

Dhanya

RFC CONNECTION ERROR WHILE CONNECTING SAP BODS TO ECC


Hi

 

 

Can anyone help me? I am facing an RFC connection error.

I created and configured a connection from SAP BODS to SAP ECC using a program ID.

 

error.jpg

 

thanks in advance

 

 

regards

Ganga

BODS 4.2 SP1 & RFC configuration


Hello, I've installed IPS 4.1 SP2 and BODS 4.2 SP1 on a Windows Server 2008 R2.

 

The installation finished without problems. I configured an RFC Server Interface, but when I look at the RFC Server Interface Status, instead of seeing the interface I created, I can only see:

 

RFC not available.jpg

I've checked in the CMC, and the EIMAdaptiveProcessingServer service seems to be configured correctly.

These are the Available Services:

APS services.jpg

If I select Data Services RFC Server Service, I can see:

DS APS RFC.jpg

What's wrong?

 

Thanks

Matching alphabet for a particular field


Hi

I am trying to use a SAP BODS Query transform to filter a table on a field XYZ that should contain only alphabetic characters.

 

Field XYZ is varchar(9)

 

I tried match_pattern(upper(XYZ), 'X'), but I am afraid it will only match fields with a single-character entry like 'a' or 'c'.

 

How can I also match multi-character alphabetic values like 'abc' or 'cbz'?

 

I am using BODS version 14.1.1.210
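The 'X' token in match_pattern stands for a single (uppercase) character, which is why multi-character values fail. A variable-length "letters only" test needs a pattern anchored at both ends with repetition; in BODS, the match_regex function (where available in your version) accepts a similar regular expression. A plain-Python sketch of the check:

```python
# "Letters only, any length" test: the regex is anchored at both ends by
# re.fullmatch, and [A-Za-z]+ requires one or more alphabetic characters.
import re

def is_alpha_only(value):
    return bool(re.fullmatch(r"[A-Za-z]+", value or ""))

for v in ["a", "abc", "cbz", "ab1", ""]:
    print(v, is_alpha_only(v))
```

Note that 'ab1' and the empty string are rejected, while single- and multi-character alphabetic values both pass.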

 

Cheers

 



