Channel: SAP Cloud Applications Studio

Multiple Solutions in one Tenant?


Hi All,

 

Multiple solutions in one tenant may look attractive at first, but in most cases, it is not a good idea.

 

What is a solution?

An SDK solution can be seen as an add-on. It is a full package of code, extension fields, screens, services, etc. A solution can be downloaded and uploaded to other tenants that belong to the same customer for which the solution was created.

 

Solutions are independent!

An SDK solution works in an isolated namespace. It can access core SAP objects, but it cannot depend on another solution. It is therefore impossible for one SDK solution to read, write, or access content developed in another SDK solution.

 

Real world problem with multiple solutions

  • You created a tab "Info" in the Account in Solution A. Now you would like to place a field from Solution B on it => not possible.
  • You calculated the opportunity item summary in Solution A. Now you would like to add this information to a print form designed in Solution B => not possible.
  • You called a web service in Solution A. Now you need some data from that web service in Solution B => not possible.
  • You created a custom business object in Solution A. Now you would like to store a reference to your object in Solution B => not possible.
  • You created some extension fields in Solution A. Now you have an MDRO developed in Solution B which needs to access these fields => not possible.

 

Best Practice:

We recommend doing all enhancements for one Cloud for Customer installation in a single solution.

 

If you have two separate development tracks and if you are absolutely sure that no content in one solution will ever need to see, access, read, write, touch, reference or remotely use any content in the other solution now and in future, then you may decide to go with separate solutions.


Some Thoughts about ChangeHistory


Hello,

 

I noticed some confusion with respect to the question:

 

When does the ChangeHistory work for a given entity?

 

Let me provide some details:

 

Custom Business Objects

 

  • With the annotation [ChangeHistory], the data of nearly any element and node can become part of the ChangeHistory
  • As transient elements are not persisted, they cannot become part of the ChangeHistory
  • For the dependent objects TextCollection and AttachmentFolder this annotation is not applicable
    => No ChangeHistory for the elements and nodes of these DOs
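As a sketch, the annotation is simply placed on the elements or nodes of a custom business object whose changes you want recorded (the BO and element names below are hypothetical):

```absl
import AP.Common.GDT as apCommonGDT;

businessobject TravelRequest {
    [AlternativeKey] element ID : ID;

    // Changes to this element become part of the ChangeHistory
    [ChangeHistory] element Status : LANGUAGEINDEPENDENT_MEDIUM_Description;

    // A transient element has no persistence,
    // so [ChangeHistory] cannot be used on it
    [Transient] element Score : DecimalValue;

    // The annotation can also be applied on node level
    [ChangeHistory] node Item [0,n] {
        element Quantity : Quantity;
    }
}
```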

 

 

Extensions to SAP Business Objects

 

  • Extension Fields are handled by the Extension Framework (EXF)
    EXF does not support the annotation [ChangeHistory]
    => No ChangeHistory for Extension Fields

  • Extension Nodes are handled by PDI itself; (nearly) the same functionality as for Custom Business Objects is available
    • If the SAP Business Object already supports ChangeHistory
      • The data of any element and node can become part of the ChangeHistory
      • Except - of course - transient elements
      • For the dependent objects TextCollection and AttachmentFolder the annotation is not applicable
        => No ChangeHistory for the elements and nodes of these DOs
    • If the SAP Business Object does not support ChangeHistory, this feature is in general also not available for the Extension Nodes

 

HTH,

    Horst

What's new in the Studio 1511 release.


With the 1511 release of SAP Cloud Applications Studio, I'd like to share the following new features with you:

  • Automated Distribution of Solution.
  • Enable Assembly Split.
  • Node Extensions in Analytics.
  • New Context Reuse Library Functions.
  • Other enhancements

 

OK, now let's go through them one by one.

1.     Automated Distribution of Solution.

         In previous versions, the studio already supported automated distribution of customer solutions to any tenant in the same system. Now you can restart it with the "Automated Distribution" button to retrigger the automatic upload and activation of a solution whose distribution was unsuccessful.

 

2.     Enable Assembly Split.

         As you know, up to 1508 the three lifecycle management steps "Activate", "Assemble", and "Download" of the customer solution were combined into one step, "Assemble and Download".

         Now, with the 1511 release, the studio developer has the option to execute them as independent steps. Previously, the single combined step could take a long time to execute for larger solutions. The split steps can run asynchronously in the backend, so the execution time for Activate, Assemble, and Download is reduced significantly.

         To enable this new function, select "Enable Assembly Split"; the separate buttons "Activate", "Assemble", and "Download" then become available. If you do not want it, you can disable it via "Disable Assembly Split".

   

3.     Node Extensions in Analytics.

          The 1508 release introduced adding new nodes to standard business objects. With the 1511 release, the studio allows us to add them to analytical objects as well, i.e. data sources, key figures, and reports.

 

4.     New Context Reuse Library Functions.

          Two new Context reuse library functions are provided with the 1511 release:

          IsProductionTenant()
          Indicates whether the tenant on which the ABSL script is being executed is a production tenant. Returns a Boolean.

          IsPatchSolution()
          Indicates whether the solution in which the ABSL script is being executed is a patch solution. Returns a Boolean.
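A minimal usage sketch of the two functions in an ABSL script (assuming they are exposed via the Context reuse library, as the section title suggests; the guarded logic is only an illustration):

```absl
import ABSL;

// Skip test-only logic on production tenants
if (!Context.IsProductionTenant()) {
    // e.g. write trace output or call a test endpoint here
}

// Detect whether this script runs as part of a patch solution
if (Context.IsPatchSolution()) {
    // e.g. guard logic that should only run during patch development
}
```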

 

5.     Other enhancements.

          "LanguageIndependent" annotation: it can be used together with the TextCollection dependent object to support language-independent text. If the user logs on with a different language than the one used during creation of the text, the same text is still retrieved.

      [DependentObject(TextCollection)] [LanguageIndependent] node TextCollection;

          Typecodes for dependent objects: a typecode is used to determine the kind of TextCollection or AttachmentFolder that is intended to be used in the UI.

          References to customer-specific fields: before 1511, if you wanted to upload a customer solution that referenced customer fields, you had to manually create those fields in the target tenant before deployment. Now, with the 1511 release, those field definitions are collected during assembly of the solution. The definitions are then used at deployment to create the customer-specific fields at a very early stage of the deployment process.

Splitting the String


Hi All,

 

As far as I know, there is no keyword in the SDK for splitting a string. Here I am sharing my attempt at splitting a string. If any improvement is required in the code or logic, or if any of you have an alternative idea or a technique that already exists in the SDK, please let me know.

 

I hope the SDK experts Horst Schaude, Alessandro Iannacci, @Stefan Krauth ...... have better code or a better technique than this.

 

Step 1: Reuse library for splitting the string. It has two importing parameters (STRING_QU_RE -> the string to be split, SPLIT_QU_RE -> the split delimiter); the return value is a collection of split strings.

Reuse_library.PNG

 

Step 2: Logic for splitting the string.

Logic.png
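Since the logic itself is only visible in the screenshot, here is a rough ABSL sketch of the same idea (parameter names follow the signature from step 1; parameter access and the way the result is handed back follow the reuse-library template, and the exact index conventions of Find and Substring should be checked against the documentation):

```absl
import ABSL;

var result : collection of LANGUAGEINDEPENDENT_MEDIUM_Description;
var remaining = STRING_QU_RE;   // string to be split
var delimiter = SPLIT_QU_RE;    // split delimiter
var index = remaining.Find(delimiter);

// Cut off one segment per pass until no delimiter is left
while (index >= 0) {
    result.Add(remaining.Substring(0, index));
    remaining = remaining.Substring(index + delimiter.Length());
    index = remaining.Find(delimiter);
}
result.Add(remaining); // trailing segment

return result;
```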

Step 3: Business object for testing the Reuse Library for splitting the string.

Business Object.png

Step 4: Script for executing the Reuse Library.

Script.png

Step 5: Testing the Business Object for splitting the string.

 

5.a: Input the String and Split Key

output.PNG

5.b: Click on Split String.

output1.PNG

output2.PNG

If this blog is helpful, please share it, like it, and comment below.

 

Thanks,

Quddus.

Implementing Access Control on Custom Business Objects


How to provide access control on a custom Business Object

 

Background to Access Control and Context.


Business objects developed by SAP in Business ByDesign can have controlled access based on the Access Control List (ACL) node data associated with them. The ACL is a dependent object (DO) that can be associated with a business object to provide controlled access. You can restrict access to instances of business objects, be it read or write, based on certain predefined contexts like Employee, Company, Site, etc. A user is assigned access on these access contexts, which controls the operations that the user can perform on business object instances.


For example, the business object ‘PurchaseOrder’ has an access context of Company. By default the access rights are ‘Unrestricted’ and the user has rights to read and write purchase orders for all companies. To control the access, the user can be assigned access (in Application and User Management->Business Users->Edit Access Rights->Access Restrictions) to read and/or write purchase orders belonging to a certain set of companies. This brings in restricted access to the business object.


pic1.png



 

The access context in the above screenshot is 1007-Company and the Access Context UUID is the company UUID. This means that this instance of purchase order belongs to company 00145EF588E602DBB8B3AC44715E0CC1, and a user who has been given access rights of ‘Read’ for this company can only read the purchase order. If write access is also given for this company, then the user can also write to this purchase order.


Access control for partner developed business objects


The same kind of controlled access can be achieved for partner-created business objects. There are two ways of doing so in Cloud Application Studio (CAS).


  1. Link the custom BO to a standard BO which supports the access context required for the custom BO. This can be done using the following association in the custom BO definition, which lets the custom BO use the access context of Purchase Order without having to integrate the custom BO with the ACL: [RelevantForAccessControl] association For_Access_Control to PurchaseOrder
  2. The second approach is to integrate the ACL dependent object with the custom BO and use the concept just like standard business objects do. However, this approach only supports the access context ‘1010-Employee’ as of today.

 

How to integrate ACL with custom business object


In this write-up I will detail the second approach mentioned above to achieve controlled access to a custom business object. The following example will be implemented as part of this explanation.


Scenario: A partner has created a business object ‘PurchaseOrder’ and wants to control access to it based on the employee who created the instance. User PGREENE is allowed to only read the records created by TWEBBER but can read and write records created by himself.


1. Create a solution in SDK.

2. Add a Business Object ‘PurchaseOrder’ to your solution and activate it. Define the BO as follows: the first three elements represent the employee ID, name, and address of the employee who created the PurchaseOrder instance. The element marked with the annotation [AccessControlContext(Employee)] creates an association from the ROOT node to the ACL dependent object. This ACL will have access context 1010-Employee, and the access context UUID is copied from the attribute ‘OwnEmployeeUUID’.pic2.png
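Since the BO definition is only visible as a screenshot, here is a sketch of what it could look like (the first three element names and their types are illustrative; the annotation is the one described above):

```absl
import AP.Common.GDT as apCommonGDT;

businessobject PurchaseOrder {
    [AlternativeKey] element EmployeeID : ID;
    element EmployeeName : LANGUAGEINDEPENDENT_MEDIUM_Description;
    element EmployeeAddress : LANGUAGEINDEPENDENT_MEDIUM_Description;

    // Associates the ROOT node with the ACL dependent object;
    // the ACL gets access context 1010-Employee and copies its
    // access context UUID from this element
    [AccessControlContext(Employee)] element OwnEmployeeUUID : UUID;
}
```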

3. Add a script file (Event-AfterModify) to fill the attribute ‘OwnEmployeeUUID’ with the current employee’s UUID; this value is then passed to the ACL node for controlled access.pic3.png
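A sketch of that Event-AfterModify script (assuming the current identity UUID can be used directly as the employee UUID; depending on the release you may need to map the identity to the employee first):

```absl
import ABSL;

// Event-AfterModify on ROOT: record the current user so the ACL
// can derive the 1010-Employee access context from it
if (this.OwnEmployeeUUID.IsInitial()) {
    this.OwnEmployeeUUID = Context.GetCurrentIdentityUUID();
}
```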

4. Create screens on the business object.                                                                 

5. Open the .uiwocview file in the UI Designer pic4.png by double-clicking on it. Under the Properties tab, section RBAMData, set the access context of the WoC View to 1010-Employee. Select the QAF and OIF screens under Assigned Objects. Save and activate.pic5.png

6. Open the OWL file in the UI Designer. Under the Properties tab, section RBAMData, select AccessControlledBusinessObject and remove the flag ‘UnrestrictedAccess’, which is set by default. This allows the OWL to load only the relevant data based on access rights.pic6.png

7. On the OWL screen the SELECT_ALL query is set by default for loading the data, but this query does not have the ACL attributes. Use the FSI query ‘QueryByElements’, which is created implicitly with the ROOT node, as the default query to load the data. You can see that this query already includes the ACL node attributes to fetch the relevant data. pic7.png

8. Set the RBAMData for the QAF screen in the same way as for the OWL.pic8.png

9. Assign the new WoC View to the users and set the access rights as follows for user PGREENE:

    a. READ and WRITE access for PGREENE's records
    b. Only READ access for TWEBBER's records.pic9.pngpic10.pngPic11.pngpic12.png

                                                                                                                                                                                                                                                       

10.     Create some instances of the custom BO with user PGREENE and with Tony Webber (TWEBBER). Test on the UI by loading the OWL and QAF. In the OWL, PGREENE should be able to see only the records created by PGREENE and TWEBBER. When PGREENE tries to edit a record created by Tony Webber, he gets the error: “Not Authorized”. pic13.png

 

 

pic14.png

 

 

 

 

Hope this blog was helpful for implementing access control on custom Business Objects.

How to make a UI flexible/dynamic using UI Switch in SDK/PDI/Cloud Application Studio


Background of UI Switch and Business Roles


You need to understand the association of UI switches with business roles for this topic. A UI switch can be assigned to a section group on a UI to control the properties (ReadOnly, Visible, Mandatory, etc.) of the attributes in that section. The UI switch is then assigned to a business role, which in turn is assigned to a user in the Business ByDesign system. Hence, the dynamic properties on the UI apply to any user who has been assigned a business role containing the UI switch. The UI switch itself is created in the SDK (Cloud Application Studio).

 

pp1.png

 

Example: Hide the “Employee Responsible” field on the Purchase Order OIF for a user

 

User PGREENE sees the Purchase Order OIF screen as shown below. The Buyer Responsible field is visible to him, and in this exercise we will hide this field whenever the screen is loaded for PGREENE. Following the same steps, you can modify the properties of other fields as well.

 

pp2.png

 

1. Create a solution in the SDK and add an extension business object for Purchase Order.

 

pp3.png

pp4.png

 

2. Create a UI Switch.

 

pp5.png

 

3. Right-click the .xbo file in the solution and select “Enhance Screen”. Select the Purchase Order OIF screen in the following pop-up. This creates a .xuicomponent file in the solution.

 

pp6.png

 

4. Open the .xuicomponent file in the UI Designer by double-clicking on it. Select the section group where the field is located. The section group must have a stable anchor for the dynamic properties to be supported. Go to the Extensibility Explorer and select the action “Adjust Properties”. In the pop-up you will see all fields of the section group; set the Visible property of the Buyer Responsible field to false. Assign the UI switch that was created in the solution.

 

pp7.png

 

5. Save and activate the UI component. Save and activate the solution.

 

6. To test the dynamic UI feature, log in to the Business ByDesign portal and assign the UI switch to a business role that is assigned to PGREENE. Go to the Application and User Management WoC->Business Users->Edit Access Rights->Business Role Assignment. Click a business role and navigate to the Business Role OIF screen. Navigate to the UI Switches tab. Click Add Row and select, from the value selector, the UI switch we created in step 2.

 

pp8.jpgpp9.png

pp10.png

 

7. Log off and log in to Business ByDesign as PGREENE, open the Purchase Order OIF screen, and you can see that the Buyer Responsible field is not visible to him. If you log in as a user who is not assigned the above UI switch, then he/she will still see the Buyer Responsible field on the UI.

 

pp11.jpg

 

 

Hope this is useful for making dynamic UIs. Please note that this can be done only on SAP standard UIs and not on custom UIs developed by partners. In custom UIs only personalisation and adaptation are supported, provided you define stable anchors for the section groups.

Creating a Master-Detail-Detail using a Custom BO in Cloud Application Studio


Hi,

This simple blog talks about creating a Master-Detail-Detail kind of scenario using a custom BO. Simply put, if you need multiple levels of detail nested within one another, you need to create this sort of structure. A typical example from a standard C4C scenario is an Opportunity: you create an Opportunity (master), it has Item details, and each Item in turn can have its own Revenue Schedule.

 

To do this in a custom BO, you would need to create a structure like that with nested nodes.

Below is an example I used from another question on SCN that was posted by Ognian Kalaydjiev, wherein:

 

The main BO was "Multiple Languages":

  1. For each language there would be an instance created (level 1 node),
  2.     and for each such instance, there would be certain characteristics maintained (level 2 node [1,n]),
  3.         for each such characteristic there would be multiple values possible (level 3 node [1,n]),
  4.             and for each such value there would be multiple descriptions (level 4 node [1,n]).

 

import AP.Common.GDT as apCommonGDT;

businessobject MultipleLang
{
       [Label("Identification")] [AlternativeKey] element ID : ID;
       [Label("Name")] element IdName : LANGUAGEINDEPENDENT_MEDIUM_Description;

       node Characterictis [1,n]
       {
              [Label("Characterictis ID")] [AlternativeKey] element CharID : ID;
              [Label("Characterictis Name")] element Name : LANGUAGEINDEPENDENT_MEDIUM_Description;

              node Value [1,n]
              {
                     [Label("Value")] element Value : LANGUAGEINDEPENDENT_MEDIUM_Description;

                     node ValueDescr [1,n]
                     {
                            [Label("Value Desc")] element Value : LANGUAGEINDEPENDENT_MEDIUM_Description;
                     }
              }
       }
}


What we want in the UI is something like the following, with multiple lists, where the values displayed in each list depend on the selection in the previous list(s):

2.png


To create a TI for this structure, where for the main language the TI opens for, the subsequent nodes' information is displayed on selection from the table, the following is required:


1. A TI data model with bindings to the respective BO elements from the BO structure

1.png


2. Check the Data Loading property of each data list in the data model


For the first list/node with [1,n], the Data Loading property won't appear.

3.png

 

3. For all the lists below this level, the Data Loading property is Lazy by default, implying that the field is only read for the parent's lead-selected data. This is also good for performance.

4.png

4.png

 

That's it! Based on the data you create, you will find the Master-Detail-Detail structure created and displayed based on the individual selections made in the different lists.

 

Cheers !!

Vinita

Dump analysis - a useful C4C Studio tool


Hi everybody. I decided to write this blog post because probably most of you don't know about a useful feature of the Cloud Application Studio: the dump analysis.

 

Thanks to Emanuele Bucci for reporting this.

 

 

1. Do you have a dump on the UI and you don't know where it is coming from? Is the UI not giving you the right information about the failing line of code in your custom ABSL code?

 

scn1.png

 

2. Just go to the Cloud Application Studio and choose the menu item View->Dump Analysis.

 

xscn2.png

 

3. You will have the possibility to choose between the dumps of today, yesterday, or all (moreover, you have an advanced search and the possibility to reset the dump list).

 

xscn3.png

 

4. And finally, by double-clicking on the dump, you will be redirected to the ABSL line of code, even if this code is part of a solution that you have not opened yet.

xscn4.png


Query/Navigate by Association - Repository Explorer


Hi All,

 

I wanted to share a few important features of the Repository Explorer which can be helpful in ABSL coding.

 

Initially, to find out in which node or association a value is saved, I used to store the association or node in a local variable and check the values using the debugger. Although it worked, it was time-consuming and irritating due to the many steps involved.

 

Then I came to know about these features in the Repository Explorer, through which we can identify the correct association without having to write any code.

 

#1: Open the Repository Explorer, navigate to any query of any business object, right-click the query, and choose Execute Query.

Capture.JPG

#2: Provide input for any selection parameter and click 'OK'; multiple select options are also possible.

Capture.JPG

#3: You get the Query Results, depends on the selection parameters

Capture.JPG

 

#4: You can generate code via the Generate Code option and use it in ABSL for querying.

 

#5: If the needed value is not available in the query result, or the values are available in a different association, then we can use Navigate by Association (drop-down), navigate to that particular association, and search for the values.

Capture.JPG

#6: If the query result has more than one entry, then we need to select a particular record and use Navigate by Association.

 

#7: The single-record view gives you an alternative way of viewing the records.

Analytics in SDK: Common mistakes to avoid


Here I document some of the common mistakes made when editing Analytics objects from the SDK:

1) Do not edit analytical objects like reports, key figures, etc. directly in the key user tool (Business Analytics work center)

Edit analytical objects such as reports by launching them from the SDK, not directly from the key user tool (i.e. by logging in to the application UI, accessing the Business Analytics work center, and editing the analytical objects there). The reason is that the SDK cannot capture changes made directly in the key user tool.

Example: Logging in to the application UI, selecting the Business Analytics work center, selecting a report created from the SDK, and adding a report view there is incorrect. Since the change was made by logging in to the application UI, the SDK won't capture it, and the change won't be visible when you upload the solution to another tenant.
The correct sequence is: log in to the SDK, open the solution, and double-click on the report to add the report view. Now the SDK will capture the change, and it will be reflected in any tenant where you upload the solution.

 

2) Do not use extension fields created with the key user tool (KUT) in analytical objects like joined data sources

Only use extension fields created from the SDK, not extension fields created from the KUT. The reason is that a KUT-created extension field is available only locally in the current tenant, so if you upload such a solution into another tenant, activation of the object will fail, since the KUT-created extension field is not available in the target tenant.

Example: An extension field was created in the key user tool and used in a joined data source; uploading this solution into another tenant fails, since the key-user-created extension field is not available in the target tenant.

 

Thanks, Pradeep

How to Utilize a Field Added Using the Key User Tool in a Script


Hi All,

 

This blog post explains how we can use a field added using the Key User Tool (KUT) in a script for writing logic.

 

Step 1: As a business user, add the field where you want it. In this case I am adding it on the Sales Order. Open the tenant in Silverlight mode. Go to Adapt->Enter Adaptation Mode. Once adaptation mode is on, click Adapt->Edit Screen. On the right side of the screen, click Extension Fields. Select the place where you want to add your field. In this case I am adding it at header level, as shown in the image below.

1.1.png

 

Step 2: Select the Order Header and click Add; a screen will appear as shown in the images below. Select the field type Text, enter the field label, and click Save.

2.png

2.1.png

Step 3: Select the field which you created in the above step, as shown in the image below, and then save and publish it.

3.png

After completing all the above steps, the field will be visible as shown in the image below.

4.png

Step 4: To utilize the KUT field in a script, we need to create a reference to the customer-specific field.


Note:

  • You cannot reference a customer-specific field in a solution template.
  • You cannot reference a calculated field.
  • You can only reference a customer-specific field in the extension of a business object that has been released in the public solution model (PSM).

5.png

Step 5: When we create a reference to a customer-specific field, a table appears with all the extension fields. Select the field which we created in step 1.

6.png

Step 6: Now extend the business object where we added the field. In this case it is the Customer Quote. Save and activate the extended BO.

7.png

8.png

Step 7: Right-click the extended business object and click Create Script File. In this case I am selecting the Root-AfterModify script; write the logic as per your requirement.

10.png

9.png
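As the script itself is only shown as an image, here is a rough sketch of the pattern (the element name ZCustomField is hypothetical; the referenced customer-specific field appears on the extended BO under its generated name):

```absl
import ABSL;

// Root-AfterModify on the extended Customer Quote:
// react once the KUT field has been filled by the user
var kutValue = this.ZCustomField;
if (!kutValue.IsInitial()) {
    // write your logic here, e.g. derive other field values
    // or validate the entered text
}
```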

Activate the solution and test it.

11.png

If anything is missing, please let me know.

 

Thanks,

Quddus.

Solution Documentation Tool


Motivation

In my past as a developer with the Cloud Application Studio, I missed an automatic documentation generator that builds documentation based on the code and comments you have already written. In other development environments this kind of documentation generation is possible, so I decided to start building a simple generator for cloud solutions on my own.

 

Implementation

I implemented the current version of the documentation generator in C# with the .NET Framework.

 

Current Features

  • Solution and Business Object File Parser
  • HTML Documentation Generation for each Business Object
  • HTML Overview Page for the Solution with links to the Business Object files
  • Capability to style the output as you like (CSS)

2015-09-18_0006.png

 

Getting Started

  • Clone the GitHub Repository (Rugosh/SolutionDocumentationGenerator · GitHub) and build the program or download the alpha release
  • Run the program with your Solution and Output path as parameters
    Example: SolutionDocumentationGenerator.exe C:\Users\tok\Documents\CopernicusIsolatedShell\Projects\BYD_DEV\YEKRNL1PY D:\dev\_testdata\bydDocu
  • View your generated Documentation

 

Forecast

I want to extend the documentation generator over time with new features, such as other output formats (e.g. Word) and parsing of other file types for more documentation possibilities.

To stay up to date on what has changed, and to tell me what you want to see, visit the repository (Rugosh/SolutionDocumentationGenerator · GitHub) and contribute.


SAP Cloud Applications Studio: External Service Integration with XSD file


Hi,

 

In the SDK you can create an "External Web Service Integration" so that you can access an external web service in your business logic.


For this you need to do the following:

Upload the .wsdl file and, optionally, .xsd files if the schema of the external web service is split into separate .xsd files. This creates a .wsid file.

Then you activate the .wsid.

 

Suppose you have created the .wsid by uploading the .wsdl and .xsd files, and activation of the .wsid gives the error "Recheck WSDL and upload again".
In such a case, check your .xsd file: it should not contain a schema location referring to a remote location.

 

Example: your .xsd has <import schemaLocation="https://www.********************.xsd" namespace="http://***********">
In this case the schemaLocation inside the import refers to a remote location, which causes the error.


Solution:

In such a case, instead of using the schema from the remote location, download the schema file from the remote location and then upload it into the .wsid.

 

Thanks, Pradeep.

Opportunity Involved Parties OVS


I have written this blog post to document a solution I recently implemented to meet a specific requirement, in the hope that it will help others with similar requirements. The solution took a bit of time to get to, as finding the root cause of errors in the UI Designer can be tricky, so hopefully documenting some of the pitfalls and how they were overcome will be useful.

 

A custom BO is created from an Opportunity on the Activities tab. The Opportunity ID is passed to the QC screen to create the custom BO linked to the Opportunity. The requirement is for the customer lookup on the QC screen to only bring back customers that are Involved Parties of the Opportunity.

 

The embedded component for the custom BO (Project Request) has been added to the Activities tab within the Opportunity and the New button calls the QC screen for this BO.

 

PR EC.jpg

 

The Quick Create screen is shown below. The Customer field is the one that requires a custom lookup to only show Involved Parties of the linked Opportunity.

 

QC.jpg

 

First I created an OVS based on the Opportunity BO.

 

Create OVS.jpg

Create OVS 2.jpg

When opening the OVS in the UI Designer, the first thing to do is delete the standard query and create a SADL query. This can be done by right-clicking the query and deleting it, and then right-clicking the Queries folder and clicking Create New SADL Query.

 

The key thing here is that deleting the standard query and creating a new one creates a new event handler and clears the query on the existing event handlers, so you need to amend these as shown below; otherwise you will get errors when launching the OVS.

 

Check the properties of the defaultset query and set the OnSelect event to call the GetValueHelpHandler.

SADL Query.jpg

DefaultSet Properties.jpg

Amend the existing event handlers to use the new SADL query

 

GetValueHelp.jpg

 

RetrieveValueHelp.jpg

SetValueHelp.jpg

 

 

I set the query parameters as below using the Party node. Originally I wanted to use the ExternalParty association; however, this doesn't work and results in SADL query generation errors (the downside of using the Party node is that all sales team parties are also returned, so they have to be filtered out by the query).

 

Note that it is not the ID at the root level that is used as a selection parameter, but the ID in the ToParent association of the Party node. If the ID of the root is used, you will only get one entry returned rather than all the involved parties. Note also that the PartyUUID is selected as a result parameter, as passing the PartyID back via the Outport did not work in my scenario (this is shown later).

 

The Basic Find indicator is set for PartyID and FormattedName, as this allows the user to search by number or name using the SearchText field.

Query1.jpg

 

Query2.jpg

 

I want the query to be based on the Opportunity ID, so this is passed into the QC via the Inport. An event is fired on initialisation to assign the ID to the query search parameters.

 

Inport.jpg

AssignEH.jpg

The data model now looks like this:

 

DataModel.jpg

Because the Party node returns all parties linked to the Opportunity (both external and the sales team), I needed a way to filter out the sales team. To do this I decided to use the PartyTypeCode and created two defaultsets: one to query 147 - Partner and one for 159 - Account. The RoleCode could also be used for this. Remember, when creating new defaultsets, to change the triggered event handler to GetValueHelpHandler!

 

defaultset accounts.JPG

defaultset partners.JPG

 

Now for the Designer. On the AdvancedListPane, the defaultsets need to be assigned as below:

 

ALP.JPG

defaultsets.JPG

Note that when changing between the defaultsets the query was losing the ID search parameter, so I added the assign DataOperation below to the GetValueHelpHandler to set the ID each time the defaultset is changed.

 

GetValueHelp2.JPG

 

The columns in the AdvancedListPane also need to be bound to the correct field in the data model:

 

column.jpg

 

Finally, the Outport needs to be amended to bind the UUID and Name in order to pass these back to the field on the QC.

 

outport.jpg

 

Now the OVS is linked to the field on the QC and the Opportunity ID field is bound as below:

 

QC2.JPG

OVS binding.JPG

 

So let's see it working:

 

These are the Involved Parties of the Opportunity

 

invovled parties.jpg

This is the OVS on the QC customer field

 

partners.JPG

account.JPG


Simple steps-creation of Odata service on Custom BO


Dear friends,

 

Many of you may already be aware of how OData service operations such as create, read, and update work. I am targeting beginners here, and I hope this document helps build a basic level of understanding.

 

I am going to explain with a simple example and a few screenshots. Let's start.

 

1. Open the Cloud Applications Studio and create a sample business object. Here my business object is named "TestBO", with a few elements in it.

 

1.png

2. The created BO is activated successfully, and the required screens and logic are added (as per the business need).

3. Log in to the system at runtime, go to the Administrator work center, and select OData Service Explorer.

2.PNG

4. In the Show filter, select the Custom OData Services option and click New.

3.png

5. In the OData editor, select the target BO and the node on which you want to create the service.

 

4.png

 

6. An entity type for the selected business object is created. Make the appropriate selections as per your requirements.

 

5.png

7. Function imports can be configured like this.

7.png

 

 

8. After completing all the above steps, save the settings and activate the service. After activation, the service URL is shown at the top of the screen, and you can check the metadata structure by opening the same URL in a browser, like this.

 

8.png

9. After successful activation, the custom OData service is visible in the system. Now we need to test whether the created service is working properly, so press the Test button (marked in yellow).

9.png
10. After clicking the Test button, a new window opens. Here I perform a create operation: POST is selected as the request method, and the alternative key field Test ID is entered.

 

Then click Execute; you will get a success dialog box, and the response to the request is shown below in the Response section.
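For reference, the test tool essentially issues a request like the one sketched below (host, service, and collection names are illustrative placeholders, not the actual tenant; note that modifying requests need an x-csrf-token, obtained from a prior GET sent with the header "x-csrf-token: fetch"):

```
POST /sap/byd/odata/cust/v1/testboservice/TestBOCollection HTTP/1.1
Host: myXXXXXX.crm.ondemand.com
Authorization: Basic <base64 credentials>
x-csrf-token: <token from the prior GET>
Content-Type: application/json

{ "TestID": "767" }
```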

 

10.png

 

11. To confirm whether the data was saved in the custom BO, you can execute a query on the test BO in the studio. We can see that a new instance has been created in TestBO (Test ID = 767).

 

11.png

 

The service is now working fine, and you can share it with users.

 

Expert input is appreciated, in case anything is missed or any necessary configuration needs to be taken care of. "Happy Day".

 

Br,

 

Naveen Kumar N

Extended node in Analytics


From the 1511 release onwards it is possible to include an extended node in a datasource, as mentioned at http://scn.sap.com/docs/DOC-68674 under the description "Node Extensions - Usage in Analytics".


However, the extended node cannot be enhanced into the standard datasource. Instead, you can:

create a new datasource on the standard business object and

then join the custom datasource with the standard datasource.


Thanks, Pradeep.

Error while activating a Solution.


Dear All,

 

Most of you may already be aware of this, but I just thought of sharing it.

 

If the deployment of your solution is failing because of the error below, you can refer to this blog.

 

Steps to reproduce this issue:

 

  1. Login to Cloud Application Studio.
  2. Open your required Solution.
  3. Go to Implementation Manager tab.
  4. Click on Activate.
  5. Error pops up.

 

The error reads: "Activation of solution failed."

 

Please refer to the screenshot below:

 

 

error 1.png

 

Cause -

 

The user is not assigned to the Business Analytics work center, and hence activation of the analytics content fails.

 

The resolution is simple -

 

The user should perform the below steps:

 

  1. Go to the Application and User Management work center.
  2. Select the business user.
  3. Click Edit, then Access Rights.
  4. In the Work Center and View Assignment tab, click the Find button.
  5. Search for the Business Analytics work center.
  6. Select the check box.
  7. Click Save and Close.

 

After performing the above steps, activation of the solution will no longer fail with this error message.

 

Regards,
Anant

Develop Better Service: New version of Do's & Don'ts Guide available


As part of SAP's Develop Better Services for ISV partners, I'm happy to make you aware of the new version of the 'Do's & Don'ts Guide'. It provides partners who develop Business ByDesign solutions with the Cloud Application Studio with comprehensive information about do's and don'ts during development, drawn from years of development experience of the experts in SAP's Cloud Service Center. On top of that, we have included special chapters from the quarterly 'Develop Better' webinars.

 

Please use and share this Do's & Don'ts Guide as much as possible in order to improve your development quality as a Business ByDesign ISV:

 

http://www.sme.sap.com/irj/sme/solutions?rid=/library/uuid/00657fc7-23de-3310-5693-ccd9c3c67ccb

Some Thoughts about Queries


Hello,

 

I am not sure if everybody understands the concept of the different kinds of queries supported in PDI and when to use which.

Let me shed some light on this.

 

Three Kinds of Queries

 

PDI supports three different kinds of queries.

  1. QueryByElement
    • I am quite sure everybody knows these queries, which are generated for every node by the framework automagically.
    • These queries support only the elements of their own node, for the selection parameters as well as for the result.
    • As they access the database table directly, there is no optimization for this query access.
      This means a fast execution can only happen if the key fields are part of the selection conditions. These are
      • Node ID (which you will probably never use in a search)
      • Alternative Key annotated elements.
    • Furthermore, this is only true if the operator for the selection is the equal comparison.
    • This makes these queries not the preferred ones if your selection criteria do not match these conditions,
      especially if there are many entries in the database table.

  2. QueryBuilder
    • You define a Query Response Transformation Node (QRTN, the technical name) which can make use of any element that can be reached from the node to which this query is attached.
    • The implementation is based on the Fast Search Infrastructure (FSI), so you can expect
      • A fast execution
      • Support of any comparison operator
    • This query will give you quick access to the search data in many places.
  3. SADL based Queries
    • These queries are defined directly in the UI (typically OWLs) and access the data directly, avoiding any framework overhead.
    • This access is not only dedicated to key access but also to queries, so you can also expect
      • A fast execution
      • Support of any comparison operator
    • As there is no further propagation of the query definition, it can only be used in the UI in which it is defined.
      No reuse.
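As a small illustration of the first kind, a QueryByElements query on a custom BO is called from ABSL like this (the BO name and key value are illustrative):

```
// Query a custom BO via its framework-generated QueryByElements.
// This executes quickly because the equality condition hits an
// alternative-key element.
var query = MyCustomBO.QueryByElements;
var selectionParams = query.CreateSelectionParams();
selectionParams.Add(query.ID.content, "I", "EQ", "4711");
var result = query.Execute(selectionParams);
foreach (var instance in result) {
	// process the found instances
}
```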

 

I hope your decision about "Which query shall I use for my case?" can now be answered more easily.

 

That's all, folks,

Horst
