Channel: SAP Cloud Applications Studio

Studio 1502 Release and What's new document


The SAP Cloud Applications Studio 1502 release installer is now available for download from the SAP Service Marketplace.


Please search for “SAP Cloud Applications Studio” and use the installer listed.


Studio_1502_Installer_link.png

 

The Studio 1502 What's New document is also now available.

Check out the SAP Cloud Applications Studio version 1502 What's New and Documentation at http://help.sap.com/studio_cloud.


1502: Org.Centre as Company - Customer-Mandatory-Check-Issue - Workaround


Hi community,

 

In the following blog post I would like to describe an issue that we faced in a project and for which we now have a good solution.

 

Scenario / Problem description:

When defining/adjusting the org model, you may have one or more org units with the Legal Definition "Company" (see screenshot below).

18-03-2015 15-45-15.png

You might not know that a customer is created in the background. This can become a problem with regard to mandatory checks on the customer:

If you have a Validation-OnSave.absl script for the Customer BO, your validation will succeed for customers saved via QC / integration, as long as your mandatory fields are filled. But the customer that gets created during the org model setup won't have the mandatory fields set. ==> You will receive an error during the creation of your org unit (company).

 

This was pretty annoying, but now there is a solution for it!

 

Solution:

We have a new indicator available on the Customer BO called "ActsAsOrganisationalCentreIndicator". If you need mandatory-field logic on the Customer BO, you can now exclude the customers described above from the check in the Validation-OnSave.absl of your Customer XBO.

 

Example-Coding:

18-03-2015 15-54-50.png
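The example coding from the screenshot can be sketched roughly as follows (a hedged sketch: the indicator name is taken from the text above, while the checked element and message handling are placeholders):

```
// Validation-OnSave.absl of the Customer XBO (sketch)
// Customers created automatically for org units (companies) are excluded
// from the mandatory check via the new indicator.
if (this.ActsAsOrganisationalCentreIndicator == true) {
	return true; // skip the check for org-centre customers
}

// Placeholder mandatory check for all other customers.
if (this.YourMandatoryField.IsInitial()) {
	return false; // raise your own message here
}
return true;
```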

Maybe this will help you some day.

 

Best regards

Tom

openSAP course starting soon for "Application Development for SAP Business ByDesign"


Many companies today have chosen SAP Business ByDesign to run their business. This cloud solution is a full and completely integrated suite, providing a vast range of functionality. Of course, every customer is unique in what they do and how they run their business. At SAP we know this all too well, and SAP Business ByDesign is built to support customer extensions. By using the SAP Cloud Applications Studio you can provide rich and powerful enhancements to your customers and/or your organization. Application Development for SAP Business ByDesign on openSAP will teach you how to develop add-ons and enhancements to meet these unique customer needs!

 

The course will start June 3rd and run for 6 weeks. It will cover a wide variety of topics including creating custom business objects and screens, enhancing standard functionality, analytics, integration, and many other areas. Anyone can sign up, even if you do not already have a license for the SAP Cloud Applications Studio. For such students, virtual environments will be provided for the duration of the course, so they may also participate (see course page for more information).

 

Please visit the course page for more detailed information and an introductory video. We are very excited to offer this course and hope you will be able to participate! Sign up today!

 

 

Application Development for SAP Business ByDesign - Nick Rose

 

 

Cheers!
Nick

Certification for SAP Cloud Applications Studio

Sending SMS via mobile provider's REST service.


Hi All,

 

 

In this post I would like to show you how we can send SMS messages from ByD using a REST service. Although I will be using an Australian mobile provider, the process would be similar for any other REST-based service.

 

SDK Items used in this example:

- External Web Service Integration - to define the REST service

- Custom BO - to define the communication parameters and recipient's phone number / message

 

Prerequisites:

- You will have to create a FREE account at https://dev.telstra.com and apply for the API Key and API Secret.  

- For now, SMS messages are restricted to Australian phone numbers.

 

Sending an SMS in our example involves two REST services: the first one is called with our API Key/Secret and returns an OAuth token. Then the token is used to call the second service and send the SMS.

 

1. Create a new BO:

 

 

smsbo1.jpg
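The business object from the screenshot can be sketched along these lines (hedged: element names and types are assumptions based on the communication parameters described above):

```
import AP.Common.GDT as apCommonGDT;

businessobject SMSSender {

	[Label("Phone Number")] element PhoneNumber : LANGUAGEINDEPENDENT_MEDIUM_Text;
	[Label("Message")] element Message : LANGUAGEINDEPENDENT_LONG_Text;

	// Responses from the two REST calls, shown on the QA screen.
	[Label("Auth Result")] element AuthResult : LANGUAGEINDEPENDENT_LONG_Text;
	[Label("SMS Result")] element SMSResult : LANGUAGEINDEPENDENT_LONG_Text;

	// Action bound to the Send button; implemented in step 8.
	action sendSMS;
}
```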

2. Activate the BO and create a new QA screen with the above elements.

 

smsbo2.jpg

 

3. Add the first External Web Service Integration (EWS) item to the solution. Select REST as the service type.

 

smsbo3.jpg

Paste in the URL of the first web service (obtained from the DEV portal). You can include the parameters, but it won't matter at this stage as they will be defined in ABSL later on anyway. As you can see, the SDK picks up the path automatically.

smsbo4.jpg

 

4. In the next screen, tick the box "Create communication scenario". Do not enter any API keys. Click Finish.

    You should end up with two new items in the solution explorer.

 

smsbo5.jpg

5. Let's now create the second EWS.

 

Note the new URL of the second web service.

 

smsbo6.jpg

 

6. After activating your objects, log in to the tenant to add the new communication scenarios.

- User and Application Management -> Communication Arrangements -> New

 

Select our first web service, EWS_SMS_Auth.

 

smsbo7.jpg

 

Select the associated communication system:

 

smsbo8.jpg

 

Ignore the Basic settings page and go straight to the Advanced settings. For our basic scenario we will set the authentication method to None and untick "Use Basic Settings".

 

smsbo9.jpg

Finish and activate the new communication arrangement.

 

7. Repeat the same steps for the second web service, EWS_SMS_Send.

 

8. Back in the studio, create a new script file for the action we defined earlier in the BO: sendSMS.

 

- enter your API key and API secret.

 

The ABSL code is fairly self-explanatory: we specify the service name that we want to use, create parameters, and execute the service. The only difference between the two REST services is that the second one uses a POST call and has actual body content containing the message and recipient's phone number. The only tricky bit was getting the body string right.

smsbo10.jpg
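The attached ABSL can be sketched roughly as follows (hedged: the exact `ExecuteRESTService` parameter order, the communication scenario names, and the provider's endpoint fields are assumptions; check the studio documentation and the attached code for the real details):

```
// Action sendSMS (sketch)
import ABSL;

// 1st call: obtain an OAuth token with API Key / Secret.
var urlParams : collectionof NameAndValue;
var param : NameAndValue;
param.Name = "client_id";     param.Value = "YOUR_API_KEY";       urlParams.Add(param);
param.Name = "client_secret"; param.Value = "YOUR_API_SECRET";    urlParams.Add(param);
param.Name = "grant_type";    param.Value = "client_credentials"; urlParams.Add(param);

var authResponse = WebServiceUtilities.ExecuteRESTService(
	"EWS_SMS_Auth_CS", "EWS_SMS_Auth", "GET", "", urlParams, "", "", "");
this.AuthResult = authResponse.Content;
// Extract the token from the JSON response (string handling omitted here).
var token = this.AuthResult; // placeholder: parse the access_token value

// 2nd call: POST the message; getting the body string right was the tricky bit.
var header : collectionof NameAndValue;
param.Name = "Authorization"; param.Value = "Bearer " + token; header.Add(param);
var body = "{\"to\":\"" + this.PhoneNumber + "\",\"body\":\"" + this.Message + "\"}";
var smsResponse = WebServiceUtilities.ExecuteRESTService(
	"EWS_SMS_Send_CS", "EWS_SMS_Send", "POST", "", urlParams, "application/json", header, body);
this.SMSResult = smsResponse.Content;
```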

 

9. Right click on the QA screen in the SDK and select Preview.

 

 

- enter a phone number and a message

  smsbo11.jpg

 

10. After you've hit the Send button you should see the responses back from the two services. The SMS result string contains a unique message ID that can later be used to query the status of the message (pending, delivered, etc.).

 

smsbo12.jpg

And here is our SMS:

 

smsbo13.jpg

 

And that's it!  I hope this post will be helpful to others working on something similar.

 

 

ABSL code is attached.

 

 

Cheers,

Paul

SAP C4C SDK: Re-use Function to fetch the Day for Specific Date


Background:

 

I had a requirement where I needed to get the day of the week for a specific date. For example, if the entered date is 06/25/2015, I want to get Thursday. I had created a discussion on SCN about this; below is the link to the discussion.

 

C4C SDK: Fetching Day for the specified Date

 

Cloud Applications Studio expert Alessandro Iannacci suggested that I create a re-use function. However, Cloud Applications Studio expert Horst Schaude mentioned that SAP will provide this as a function in the next release. But I thought I would get the experience of creating my first re-use function, and I thought of sharing this with the community.

 

Note: I have created a Custom BO just to implement and test this functionality before implementing it on any standard object.

 

 

 

Business Object Definition

import AP.Common.GDT as apCommonGDT;

 

 

businessobject DayReuseLib {

	[AlternativeKey] [Label("Enter Date")] element tdate : Date;
	[Label("Today")] element today : LANGUAGEINDEPENDENT_ENCRYPTED_MEDIUM_Description;

	[Transient] [Label("Leap Year")] element leap : Indicator; // indicator for leap year check
}

 

Event- AfterModify.absl

import ABSL;

import AP.Common.GDT as apCommonGDT;

 

 

 

var value1;
var year;

if (!this.tdate.IsInitial()) {

	// ********* Checking if Leap Year ***************** //
	// Derive the four-digit year from the entered date; ToString() on a
	// Date is assumed to return the format "YYYYMMDD" here.
	year = this.tdate.ToString().Substring(0, 4);

	var nyear : NumberValue;
	nyear = year;
	this.leap = false;
	var leap : FloatValue;
	leap = nyear / 4;
	// Note: dividing by 4 ignores the century rule (e.g. 1900 was not a leap year).
	if (!leap.ToString().Contains(".")) {
		this.leap = true;
	}

	// Call the re-use function to get the day
	value1 = Library::DayReuseLib.ReturnDayfromDate(this.tdate, this.leap);

	if (!value1.IsInitial()) {
		this.today = value1; // Display the day on the screen
	}
}

 

Below is the Re-Use function coding.

 

Screen1.JPGScreen2.JPG

Screen3.JPG

Screen4.JPG
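For readers who cannot see the screenshots, the re-use function can be sketched as a Zeller-style weekday calculation (hedged: the parameter handling, the "YYYYMMDD" ToString() format, and the while-loop emulation of modulo/floor division are assumptions; the screenshots show the author's actual coding; Zeller's congruence does not actually need the leap-year flag):

```
// DayReuseLib.ReturnDayfromDate(InDate : Date, InLeap : Indicator) -- sketch
var dateStr = InDate.ToString();   // assumed "YYYYMMDD"
var y : NumberValue; var m : NumberValue; var q : NumberValue;
// ... fill y (year), m (month) and q (day) from dateStr as in the caller ...

// January and February count as months 13 and 14 of the previous year.
if (m <= 2) { m = m + 12; y = y - 1; }

// K = y mod 100, J = floor(y / 100), emulated without a modulo operator.
var K = y; var J : NumberValue; J = 0;
while (K >= 100) { K = K - 100; J = J + 1; }

// f1 = floor(13 * (m + 1) / 5), f2 = floor(K / 4), f3 = floor(J / 4)
var t = 13 * (m + 1); var f1 : NumberValue; f1 = 0;
while (t >= 5) { t = t - 5; f1 = f1 + 1; }
var f2 : NumberValue; f2 = 0; t = K;
while (t >= 4) { t = t - 4; f2 = f2 + 1; }
var f3 : NumberValue; f3 = 0; t = J;
while (t >= 4) { t = t - 4; f3 = f3 + 1; }

// h = (q + f1 + K + f2 + f3 + 5*J) mod 7; 0 = Saturday, 1 = Sunday, ...
var h = q + f1 + K + f2 + f3 + 5 * J;
while (h >= 7) { h = h - 7; }

var result = "";
if (h == 0) { result = "Saturday"; }
if (h == 1) { result = "Sunday"; }
if (h == 2) { result = "Monday"; }
if (h == 3) { result = "Tuesday"; }
if (h == 4) { result = "Wednesday"; }
if (h == 5) { result = "Thursday"; }
if (h == 6) { result = "Friday"; }
return result; // assumption: the library function returns via "return"
```

For the example above (06/25/2015): J = 20, K = 15, f1 = 18, f2 = 3, f3 = 5, so h = 25 + 18 + 15 + 3 + 5 + 100 = 166, and 166 mod 7 = 5, i.e. Thursday.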

 

Hopefully this might be helpful to the community.


Cheers.



ONE OFF conflict during tenant move


Dear All,

 

We receive many incidents concerning ONE OFF conflict analysis in the target system.

 

Either an automated incident is created by the system, or the provisioning team observes this mismatch when a tenant copy is triggered by the customer.

 

This happens when the source system has a higher version of the add-on than the target system, i.e. the production system has a higher version than the test system, or vice versa.

 

The add-on version of the original solution has to be the same for a tenant copy; this is a precheck for the tenant copy process. It is mandatory in order to avoid future inconsistencies in the system.

 

Steps to resolve this issue:

 

  • Log in to Cloud Application Studio.

 

  • Upload the latest ZIP file of the original solution in the respective target system.

 

  • Activate the uploaded solution.

Mass Data Upload for a Custom Business Object with a Sub Node


As Mr. Horst Schaude has already posted a blog regarding XML file input, I am describing here how to upload an XML file if the Custom BO contains a sub node, with reference to Horst's blog.

 

Have a look , how to create a service integration via XML File Input.

 

Here I want to describe what steps need to be done if the data of a Custom Business Object shall be uploaded.

 

For this task we choose the service integration via XML File Input.

At the end we will be able to upload the data even periodically.

Step 1: Create the Custom Business Object


Here is a very simple Custom BO with some fields at the Root node and a sub node with a 1-to-n multiplicity.

If you want not only a simple upload (= Create) but also an update (= Modify) you should provide an Alternative Key for the Root node.


 

  

There is even a navigation association based on a relationship code (which is a custom code list data type).

Have this Custom BO active before the next step.
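Such a Custom BO could look roughly like this (hedged: element names are invented for illustration and anticipate the UPLOAD/PRODUCT example used later; the screenshots show the actual object):

```
import AP.Common.GDT as apCommonGDT;

businessobject Upload {

	// Alternative Key allows updates (Modify), not only creates.
	[AlternativeKey] [Label("Upload ID")] element UploadID : ID;
	[Label("Description")] element Description : LANGUAGEINDEPENDENT_MEDIUM_Text;

	// Sub node with 1..n multiplicity
	node Product [1, n] {
		[Label("Product ID")] element ProductID : ID;
		[Label("Quantity")] element Quantity : Quantity;
	}
}
```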

Step 2 : Create an XML File Input Service

First choose from the context menu of the Custom BO the entry “Create Service Integration”.

 

  

Second in the upcoming window select the radio button “XML File Input” and press “Next”.

  


Now the fields for the basic information (e.g. receiving BO) are already filled. You may adjust only the integration name.


  

In the next window you can select the elements of the Custom BO which shall become part of the XML file so they can be uploaded.

 

  

As we want to do mass upload we set the flag “Mass Processing”.

 

Now we need to define how the Custom BO instance can be identified in case of update. For sub nodes this is done via a combination of fields.


  


Finally save the Service Integration and activate it.

 


A WebDAV folder is created, which you will need later to place the XML files in.

If you re-open the Service Integration you will be able to download the XSD schema definition from the “General” tab.


  

 

 

 

 

Step 3: Create the XML File

 

Use the XSD from above to create your XML files.

  • One way (for Business Objects with only a Root node) is via Excel
    • Menu Developer -> Click on Source -> Add the XSD via "XML Maps..."
    • All entries with the red star are mandatory
    • From the "MessageHeader" you need only the "CreationDateTime"
    • In my example only the "MyFamily" root node is supported 
  • Another solution is to open the XSD in MS Visual Studio.
    • From 2008 SP 1 on you can generate sample code in the XML Schema Explorer
    • Make sure that you right-click on the element typed with the request


                            

  

    • In the generated code you delete from the MessageHeader tag all entries except the CreationDateTime tag
    • In the List tag you will find all nodes and fields from your Custom BO.
      • You may add or remove some tags for additional or superfluous node instances
      • You need to replace the sample data by your real data

                          

    

 

    • The third way I know is a commercial tool named Oxygen XML Editor

Now I want to describe the XML file input if your Custom Business Object has a sub node.



  Save this XML file. Open Excel and go to the Developer tab. Click on Source -> click on XML Maps -> click Add, browse, and select your XML file.

  Here I have taken an example in which I have one root node > UPLOAD and one sub node > PRODUCT.
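A matching XML file for such a root/sub-node structure could look roughly like this (hedged: the tag names follow the UPLOAD/PRODUCT example, while the wrapper elements and namespace are abbreviated placeholders; the generated XSD defines the exact structure):

```
<UploadBulkRequest>
  <MessageHeader>
    <CreationDateTime>2015-08-27T12:00:00Z</CreationDateTime>
  </MessageHeader>
  <List>
    <UPLOAD>
      <UploadID>1001</UploadID>
      <Description>First upload</Description>
      <PRODUCT>
        <ProductID>P-100</ProductID>
        <Quantity>5</Quantity>
      </PRODUCT>
      <PRODUCT>
        <ProductID>P-200</ProductID>
        <Quantity>2</Quantity>
      </PRODUCT>
    </UPLOAD>
  </List>
</UploadBulkRequest>
```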

 

 

Once your file is added, all the elements will show on the right side. Just click on your root node, then drag and drop it onto the sheet.





 

Now all your elements will be on the sheet; your root node and your item node are now available in the sheet.

Just do one thing: select your root element (the header part), copy the header part only, select a single cell of Excel, and Paste Special as Transpose.

Once you have pasted it, delete the existing header element from the sheet.

 

 

 

Now your excel sheet will look like this ->

 

Now we have to bind the header elements from the XML Source.

 

 

 

 

 

 

 

 

Now it is ready.

 

 

Go to the Developer tab, export it as XML, and save it. Then go to the Application and User Management work center, sub view File Input, and upload your XML file.


Thanks,

Manoj Kannaujiya.

 


Patch Version Is Not the Same as the Original Solution


Below is the behaviour of a patch created in the different systems:

 

Create a Patch on a Customer's Test Tenant.


  • Login to Cloud Application Studio.
  • Select the Solution.
  • On implementation Manager toolbar, click on create Patch button.


The system creates a patch solution and copies all the files in your solution to a new namespace within the patch solution. The namespace is generated automatically and cannot be changed.


Create a Patch on your Development tenant.

 

  • Log in to Cloud Application Studio.
  • Select the solution.
  • On the Implementation Manager toolbar, click the Create Patch button.

 

The system creates a patch in the same solution and sets the solution status to In Development. The version number of the solution is updated to the next higher version.

 

Patch is not assembled to the original solution but to the patch version.

 

 

Now, if the original solution exists in the production system and you are creating a patch, you may observe that the patch you have created is not uploaded to the original solution, but to the patch version of the solution in the test tenant.

 

Reproducing the Issue:

 

  • Open the Original Solution in the test tenant.
  • Go to the current version tab of the Implementation Manager.
  • Click on the button Upload.
  • The patch will be uploaded with the new namespace and not to the original solution.

 

The system behaviour is as expected. The reason why the patch is directly uploaded with the new namespace is that the test tenant and the production tenant are in the same system. In this case, if you try to upload the solution you will get a pop-up window with the warning message: Please note that an older version of this solution already exists on the production tenant.

 

Patch.png

 

If you click OK in this pop-up, the solution will be uploaded with the new namespace. If there were only test tenants in the system, the original namespace would have been picked during upload of the patch.

 

You then have the option to directly upload the patch solution to the production tenant, which will trigger an automatic deployment in the test tenant with the same version.

 

Here you don't need to perform any manual actions in the test tenant to bring the patch to the same level; once the patch is uploaded to the production tenant it will be automatically synced with the test tenant, as both tenants are in the same system.

How To Create an Embedded Report (Any Data Source) and pass values


This blog post will help you embed any report based on a standard OR custom data source and also pass values to your report selection criteria.

 

I could not find any documentation about this feature, SAP informed me of this feature after I raised an incident.

SAP if you are reading this and there is documentation available publicly please post in the comments and I will update.

 

You do not need the cloud studio to perform this task. I actually tried this with the cloud studio but could not achieve the requirement.

See my attempt here if you are interested:

Cloud Studio SDK: Can we create embedded reports

 

Requirement:

Embedded sales reporting in the customer account screen. Our sales data came from a custom data source we generated, and we need it available for our users to view individual customer sales data without having to open the Analysis work centre. This means we need to pass the Account ID to our report from the Accounts screen/BO.

 

 

Prerequisites:

You will need to have administrator access to the Business Analytics work centre.

You will also need a browser such as IE that supports Silverlight (Chrome no longer supports Silverlight).

 

Step 1: Create your report

Open up C4C in silverlight mode so you can get access to the Business Analytics workcentre.

Create your report with the key figures you will need. In my example I created a report over a custom data source.

The report name doesn't matter too much but I would suggest you note where this report will be used so other administrators don't edit it by accident.

SetupReport.png

Step 2: Create View for Your Report

This part is important. Make sure the report view name is something meaningful to the end user as the title will be displayed on the screen.

Set up the rows/columns exactly how you want the report to appear to the user.

CreateViewReport.png

Finally, I had the requirement to pass 'year' to my embedded report. This isn't available from the Accounts screen, so I had to save it as part of the default report view to ensure the report runs correctly. SAP provides the ability to set a relative selection date, which means you can set the current and previous year very easily (a very cool feature).

SettingYearParameter.png

Save your view and selection as report default (important).


Step 3: Assign the report to the customer workcentre

If you miss this step, the end user will receive an authorisation error! If you are embedding your report in another screen, you obviously use a different work centre.

AssignReport.png

 

 

Step 4: Go back to the HTML5 view (if this is the view that your users are using) and back up your page layouts

I suggest you do a quick backup of ALL layouts so you can easily re-import them if something goes horribly wrong. Just click Export Layout and select all layouts.

exportpagelayout.png

Step 5: Open the screen in which you wish to embed your report. Edit the master layout.

In my case I created a new tab especially for these reports, but you can embed them anywhere you please. Take note: any changes you make now will happen in REAL TIME. Be careful in a production tenant.

 

One more note - I found it to be a little bit slow at times - Be patient! It will eventually process your clicks!

 

When you create a tab it will create a section for you automatically as below:

section.png

Click on the circle with the + inside to add new items... Hang on a minute... where is the option to add embedded report? This was a hard to find feature!

 

Click the little UP ARROW on the right. This will navigate to the parent.

Hang on... there is still no option for embedded reports!

Click the little UP ARROW once more. BOOM! The option becomes available.

addreport.png

Step 6: Select the report (search by view name as defined in step 2) and configure report parameters.

In my case I needed to pass account ID so I mapped the screen field Account (*) to the UUID report parameter.

If you followed my instructions carefully at the end of step 2, tick 'Report Default' for Year and it will be set to current YYYY and last YYYY.

ReportParameters.png

One more important note! Untick 'Show collapsed view'. Otherwise you have to expand the report manually each time you open the screen.

 

Step 7: Click 'End Layout Changes' from the adapt menu and reopen your screen to test

The fun part - has it worked? If you have configured your views and parameters correctly the report will open as below!

RESULT.png

 

ONE BIG LIMITATION - Embedded reports ignore saved report view defaults and will not open as a line graph or column chart by default (they open in table view). The user has to change this manually each time. I am going to raise an incident about this issue.

SAP Cloud Application Studio Performance Best Practices


!!! THIS DOCUMENT IS WORK IN PROGRESS !!!

Last Update: 27. Aug. 2015

 

In order to develop quality and fast applications with the SAP Cloud Applications Studio, you need to educate yourself about how to use the toolset properly.


Check for updated features


SAP Cloud for Customer is updated every 3 months. The added features are highlighted in the What's New section of the documentation at help.sap.com/studio_cloud. These features are often capable of replacing expensive workarounds you had to implement in the past to achieve the same functionality.

 

Event execution iterations


Be aware of the execution logic and always think about how often events are called at runtime and whether you can reduce the number of iterations.


  • AfterLoading: Is executed when a document is loaded (buffer is filled, UI is not yet displayed)
  • AfterModify: Is executed for each node update. Can generate long loops when updating other nodes.
  • BeforeSave: Is executed for each node save. Not as expensive as the AfterModify


  Also think about alternative channels that can update a business object: 

 

  • Integration: You can exclude code execution from integration by checking if Identity.BusinessPartnerUUID() is set.
  • Migration: You can exclude code execution from migration by adding an identifier to the migration template that can be filled and checked in the code.


Sometimes it is advisable to do expensive calculations within a dedicated action that can be called from the UI. For example calculate the item summary on the header only when the add or delete button is clicked on an item list. 
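The integration check mentioned above can be sketched like this (hedged: the identity call comes from the bullet above, while the surrounding structure and element names are placeholders):

```
// AfterModify (sketch): run expensive recalculation only for dialog users.
// Integration (web service) users typically have no business partner assigned.
var bpUUID = Identity.BusinessPartnerUUID();
if (!bpUUID.IsInitial()) {
	// interactive user: recalculate UI-relevant values here
	this.TotalText = this.Item.Count().ToString(); // placeholder logic
}
```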

 

Too many retrieves by navigating through instances


The ABSL language makes it very easy to navigate through objects. Behind the scenes, objects are being retrieved and discarded. In a nutshell, every dot retrieves something, either a node or an association. Accessing a node is fast; retrieving an object by association is much slower, and doing it over and over again easily adds up to several seconds.


For performance reasons, retrieves by association should be kept to the possible minimum. Results from retrieves by association should be buffered in the coding if possible.


Example:

In the code below, toParent, toBusinessPartner, EmployeeResponsible and DefaultAddress represent associations. The code results in 13 retrieves on the server side.



if (this.toBusinessPartner.IsSet()) {
	if (this.toBusinessPartner.EmployeeResponsible.IsSet()) {
		if (this.toBusinessPartner.EmployeeResponsible.Address.DefaultAddress.IsSet()) {
			this.toParent.RespEmplAddrStreet = this.toBusinessPartner.EmployeeResponsible.Address.DefaultAddress.Street;
			this.toParent.RespEmplAddrPostCode = this.toBusinessPartner.EmployeeResponsible.Address.DefaultAddress.PostCode;
		}
	}
}

 

A better code example saves the instances in local variables. The example below works with 7 associations, which roughly speeds it up by a factor of two.

 

if (this.toBusinessPartner.IsSet()) {
	if (this.toBusinessPartner.EmployeeResponsible.IsSet()) {
		var emplResp = this.toBusinessPartner.EmployeeResponsible;
		if (emplResp.Address.DefaultAddress.IsSet()) {
			var emplRespAddress = emplResp.Address.DefaultAddress;
			var parent = this.toParent;
			parent.RespEmplAddrStreet = emplRespAddress.Street;
			parent.RespEmplAddrPostCode = emplRespAddress.PostCode;
		}
	}
}

 

 

Keep in mind that .IsSet() leads to an association retrieve. Avoid redundant "retrieve by association" operations by storing the result of an operation in a variable or collection.

 

Using associations in trace statements: even if the trace is not active, the content inside the trace statement is evaluated. This can also lead to "retrieve by association" operations that can easily be avoided.

 

Remove trace statements


A pretty easy improvement is the removal of trace statements. If you have code like this in your project:


Trace.Info("Instance Count", this.toAnotherBO.Count());


The this.toAnotherBO.Count() is getting retrieved and executed even when the trace is not set to active.


Avoid save events


As a best practice, you should not trigger a save from the UI at all. While a user is in edit mode, he should be able to work on a document; if the user hits Cancel, everything he did since hitting Save the last time should be rolled back. This does not work when a save is triggered as part of the application logic.


Use buffered retrieves instead of queries


The retrieve method retrieves an object from the current buffer. This is the fastest and best choice to get access to an object. Using a query bypasses the buffer and is therefore slower in most cases.

 

Database queries


Because a query bypasses the buffer and is executed on database level, queries are slow. Try to find ways to use retrieve; even if you have to retrieve an intermediate object first, it is often faster than using a query.


If you have to use a query, use query.ExecuteDataOnly() if you're only interested in the result data and not in the object instances.


Usage of QueryByElements (auto generated query)


The default query QueryByElements does not support full index search. In general it has a linear dependency on the number of instances in the business object node (t = O(n), where n is the number of BO instances in the database).


Therefore it should be used only if:


  • The expected number of records in the node is small (< 1000), for example in the case of an object with configuration data, or
  • The selection parameter list contains an equal condition on an element that is an alternative key of the node. An alternative key is supported by an index, so that the runtime dependency is t = O(log n).


In all other cases, an application-defined query (defined with the Query Wizard) has to be used. An application-defined query supports full index search on all query elements (t = O(log n)). This advice holds for query calls in BO implementations, UIs, web services, etc. Independently of the query used, the number of selected instances must be as small as possible, as the runtime depends linearly on the number of selected instances (t = O(m), where m is the number of selected(!) BO instances). If possible, define a join query in the wizard instead of selecting a large amount of data and doing the selection in your coding.
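An application-defined query is called in ABSL roughly like this (a hedged sketch: the query name `QueryByBuyerParty`, the BO, and the element names are invented for illustration):

```
// Application-defined query created with the Query Wizard (sketch)
var query = SalesDocument.QueryByBuyerParty;          // hypothetical query
var selParams = query.CreateSelectionParams();
// Include ("I") rows with an equal ("EQ") condition on the indexed element.
selParams.Add(query.BuyerPartyID.content, "I", "EQ", "CUST-1000");
// ExecuteDataOnly is faster when only the data, not the instances, is needed.
var resultData = query.ExecuteDataOnly(selParams);
```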


Where and Sort operations are available on collections and make it possible to reduce the number of nested loops.


Mass enabled events


Mass-enabling of actions and events is supported. In mass-enabled script files, the “this” operator is a collection of business object nodes instead of a single instance.
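In a mass-enabled script this looks roughly like the following (hedged sketch; the node elements are placeholders):

```
// Mass-enabled AfterModify (sketch): "this" is a collection of node instances,
// so the event runs once for the whole set instead of once per instance.
foreach (var instance in this) {
	instance.TotalAmount.content =
		instance.NetAmount.content + instance.TaxAmount.content; // placeholder elements
}
```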


Nested loops


Nested loops (foreach, while) over collections with a large number of members should be avoided, because they lead to a runtime of t = O(n * m * ...).


  • Where and Sort operations on collections are available that make it possible to reduce the number of nested loops.
  • Mass-enabling of actions and events is supported. In mass-enabled script files, the “this” operator is a collection of business object nodes instead of a single instance.
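A Where filter can replace the inner loop of a nested lookup (hedged: BO and element names are placeholders):

```
// Sketch: filter with Where instead of an explicit nested foreach
foreach (var deliveryItem in this.DeliveryItem) {
	// pick the matching order items directly via Where
	var matches = this.OrderItem.Where(n => n.ProductID == deliveryItem.ProductID);
	if (matches.Count() > 0) {
		deliveryItem.OrderQuantity = matches.GetFirst().Quantity;
	}
}
```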


Execution times 

 

These numbers have been collected on a small test solution and may be higher in bigger objects. These are by no means official numbers and not meant to be a KPI or performance indicator.


  • Retrieve: 44ms
  • Create Node: 67ms
  • Execute Query (1000 records): 16ms
  • Raise message: 67ms

 

Keep in mind that the retrieve runtime is stable, while query runtime grows logarithmically, which makes queries expensive on larger tables. Also keep in mind that these numbers only apply to defined queries and not to the auto-generated QueryByElements query, which is much slower.

 

Lazy load UI components


It is possible to influence the UI component loading sequence by enabling Lazy Loading.


  • Lazy Loading can be activated when adding Embedded Components to standard screens. It will result in an initialization of the embedded component when it gets displayed instead of an initialization when the host object gets loaded. This is often a good idea, but might lead to unintuitive behavior when the logic in the embedded component writes data back to the host BO for example. Then you would see different data on the host BO before you navigated to the embedded component the first time, and different data afterwards.
  • Lazy Loading can be activated at custom thing inspector level by setting the data scope handling attribute. This leads to lazy loading of all thing inspector facets.

 

Enable operation clubbing

 

This feature allows the packing of multiple UI resources (JavaScript, CSS, etc.) into one package. This effectively leads to a smaller number of requests required on the client side. It can be enabled by setting the floorplan property "Enable Backend Operations Clubbing" to "true". This has an effect mostly on very slow connections with high latency (mobile 2G/3G networks).

SAP Cloud Applications Studio Deployment & Landscape Basics


Preface

 

This document describes the basic development and deployment life-cycle for SAP Cloud for Customer custom development using the SAP Cloud Applications Studio. The SAP Cloud Applications Studio is a client application installed on the developer’s computer that connects via a secure SSL connection to the SAP Cloud for Customer system. One physical system can host multiple Cloud for Customer tenants. A tenant is a secure and isolated runtime with one URL. Using this technique, SAP is capable of hosting multiple SAP Cloud for Customer instances for one or more customers on one system.

 

The SAP Cloud Applications Studio gives the developer an easy development environment in which he can develop event- and script-based logic without knowing the system-internal architecture.

 

All entities created with the SAP Cloud Applications Studio are stored in the common area of the system. They are invisible to all customers except the customer for which the solution has been developed. Only this customer can activate the solution in his tenants. There is a recommended way to set up a system landscape that includes custom development.

 

 

Terminology

 

There are a few words that are often misunderstood and mixed up.

 

Term: Explanation

System: One system is usually powered by two or more servers; how much hardware powers a system is usually not important to know. One system is one SAP ECC installation containing multiple clients and tenants. Systems are named with three-letter identifiers like "LEF" or "KLB".

Client: One SAP ECC installation can host up to 999 clients. The clients are used to host tenants; there is a 1:1 relationship between client and tenant. Clients are named by a 3-digit number like "010". Usually they come together with the system, like "LEF 010" or "LEF/010".

Tenant: A tenant is one SAP Cloud for Customer instance running inside an SAP ECC installation; a tenant runs inside an SAP ECC client. The tenant is named by the tenant URL, which is built using the following notation: "https://my<tenantID>.crm.ondemand.com". The tenant ID is stable and should be used to identify a tenant. A tenant can be moved, so the client and the system hosting the tenant can change.

Deployment: A deployment describes the life-cycle step which copies (deploys) a solution from one tenant to another using the SAP Cloud Applications Studio. The deployment process does not necessarily involve SAP. The development party can download a solution to their computer (it is a zip archive) and upload and activate it on the target tenant.

Solution: A solution describes a custom-developed add-on which has been built using the SAP Cloud Applications Studio. A solution is named like this: "YDF1D4GKB2_ (My PDI Solution)". The prefix starting with Y is the unique identifier for the solution and should be used when communicating with SAP.

Role Code: Each tenant has a so-called tenant role code. There are two role codes in SAP Cloud for Customer: "Test" and "Preproduction". The role code "Test" is used for test and development tenants; the role code "Preproduction" is used for production or production-like tenants. The tenant role code indicates in which landscape the tenant is hosted, which is important to know when it comes to upgrade cycles and downtime planning.

Default Landscape

 

When a customer signs up for SAP Cloud for Customer, SAP provides a test tenant as the initial tenant, which is used to start the implementation project. It can later act as the test tenant. Over time, a development tenant (if required) and a production tenant are added.

 

The customer does not know on which server his tenant is located. This information is not required on customer side. However, it plays a role when setting up a SAP Cloud Applications Studio (also known as PDI for "Partner Development Infrastructure") landscape, due to its deployment architecture.

 

The recommended landscape for PDI development is a 3 tenant setup:

 

  • One development tenant (additional tenant, minimal configuration only for developers)
  • One test tenant (fully integrated and primary test system)
  • One production tenant (fully integrated for production use)

 

In a SAP Cloud for Customer contract, two tenants are included. This is enough for customers that do not need custom development. A third tenant should be bought in addition to build a full custom development landscape.

 

This picture shows the full process. The document will describe how to get there.

 

final_landscape.png

 

The landscape is usually built up as the development progresses. We will have a closer look at the landscape step by step.

 

SAP provides the customer a test tenant. The customer uses this tenant to start the implementation project and configures it to its needs. Once configured, the customer requests a production tenant and the configuration is copied over.

 

When the customer needs custom development, they (or the authorized development party) could connect to this test tenant with the SAP Cloud Applications Studio and start developing. This is not recommended, as this tenant is most likely integrated and well configured. Performing PDI development there would make this tenant incompatible with a later test or production tenant. Therefore it is necessary to order a dedicated development tenant (which technically is also a test tenant, where the PDI development happens).

 

If an additional test tenant is requested before development is performed, the configuration from the initial tenant can be taken over. Once custom development has been performed, a test tenant can still be created, but the configuration must be redone manually as the configuration profile is now incompatible.

 

SAP only knows about two tenant types: "test" and "production". The full landscape for custom development contains two test tenants. Which of the two is used for development doesn't matter. The term "development tenant" refers to the test tenant that is used for development.

 

To make it simple, these are the possibilities:

 

 

  • Option 1 (preferred): Request an additional test tenant. Start development on the new test tenant. Keep testing activities on the initial tenant.
  • Option 2: Start development on the initial tenant. Request an additional test tenant. Move all testing activities to the new test tenant.

 

It is important to separate development from testing.

 

 

Phase 1: Perform Custom Development using SAP Cloud Applications Studio

 

The development party will create a solution, which is called the "Original Solution". The actual solution name is the generated prefix value "ABCDE1234_". This unique prefix identifies the solution and is used in multiple places. Once a PDI solution has been created, the tenant is a development tenant. The developments are immediately visible in the system.

 

The developments can be tested in the development tenant. The development tenant should be configured only as much as the developer needs for testing. Do not rely on test data or integration on the development tenant, as a change in Phase 3 will break both.

 

At the end of phase 1, there is one development tenant with a basic configuration and a test tenant which can be fully configured as far as possible without the custom developments.

 

The graph also contains the physical system AKL as an example. This is only relevant when copying entire tenants, which is explained later. To be consistent throughout this document, the system is shown in all graphs.

 

1.png

 

Phase 2: First Deployment of Custom Development to the Test tenant

 

At a certain stage the custom development is either finished or at a point where you want to test it in a well-configured tenant. Then the solution should be transferred to the test tenant. The development party has to prepare the solution for the initial deployment: the solution must be error free and a Business Adaptation Catalog entry must be created (a developer task).

 

The development party can now deploy the solution to the test system.

 

2.png

 

PHASE 3: Switch the Development Tenant to Bug fix Mode

 

It is always advised to prepare the system for bug fixing, even if there are no bugs right now. Once a bug is found, the fix must usually be created as fast as possible, so it makes sense to perform these steps early. This is required only once in the entire solution life-cycle.


  • Step 1: Create Initial Patch: The development party creates the initial patch in the development tenant.


3.png


After step 1, changes in this patch are not visible to business users. The patch solution is a new solution in the tenant, and step 2 makes this solution visible and the original solution invisible. This step is required only once and does not affect any other tenants.


Business users do see the active solution (orange). Changes in an inactive patch solution are invisible.


4.png


  • Step 2: Activate Patch Solution: The development party sets the patch solution active.


Activating a patch solution will disable the original solution. The patch solution has a different unique prefix. This has an effect on the runtime data.


5.png


To understand the consequences, some technical details help. Here is an example BO model from the original solution ABCDE1234_:


businessobject ExampleBO {
    [AlternativeKey] element ExampleBO_ID : ID;
    element Name : LANGUAGEINDEPENDENT_MEDIUM_Name;
}

 

This will generate SAP ABAP entities on the server with names containing the solution prefix.

 

Package: $ABCDE1234_
BO: ABCDE1234_EXAMPLEBO
 - NODE: ROOT
     - ELEMENT: ABCDE1234_EXAMPLEBO_ID
     - ELEMENT: ABCDE1234_NAME

 

Runtime data is stored in the generated business object. In parallel to the original solution (1), there is now the patch solution with a different prefix. This results in new business objects, elements, etc. Technically, the patch solution is a full copy of the original solution. It looks the same on the frontend, so from the frontend the user sees no difference. For the business user, it looks like the data has been deleted.

 

Bildschirmfoto 2015-08-27 um 17.00.21.jpg

Bildschirmfoto 2015-08-27 um 17.00.37.jpg

 

 

 

 

Activating the patch solution will set the patch solution active (2). Therefore all runtime data stored in the original solution will be invisible.
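To illustrate why the data becomes invisible, consider the generated entities again. The patch solution has its own prefix (ABCDE5678_ below is purely hypothetical) and therefore generates a second, independent set of entities next to the original ones:

Package: $ABCDE5678_
BO: ABCDE5678_EXAMPLEBO
 - NODE: ROOT
     - ELEMENT: ABCDE5678_EXAMPLEBO_ID
     - ELEMENT: ABCDE5678_NAME

Runtime data written under the original prefix stays in the ABCDE1234_ entities, which the active patch solution never reads.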

 

Tasks to perform:

  • Adapt integrations that use solution-created web services to the new namespace
  • Recreate solution specific fine tuning
  • Recreate test data

 

This step must only be performed in the development tenant. It does not affect any other tenants. This is the reason why it is advised to not rely on the development tenant test data/configuration.

 

The patch solution is now active.

 

  • Step 3: The patch solution is active from the PDI perspective, but even if the status is "Active for Business Users", this only means that the solution is visible in the system scoping. Before it really becomes visible to business users, it must be scoped and configured. If the original solution was scoped and configured before, these steps must be performed again.

 

Activating the patch solution will first remove the original solution from scoping and then add the patch solution to the scoping. This resets the solution configuration.

 

PHASE 4: Life-cycle for patching and bug fixing

 

The tenants are now ready to run the full deployment and bug fixing cycle. Each development requires three activities:

  • Create new patch version (1)
  • Perform development
  • Deploy to test tenant (2)

 

The deployment process can convert a solution between the original solution namespace and the patch solution namespace. Based on the tenant type, you can select whether to create a patch on this tenant or to replace the original solution. This conversion is called "aliasing". In the regular deployment process you will never need to create a patch solution using a deployment; usually you want to replace the original solution.

Bildschirmfoto 2015-08-27 um 17.04.10.jpg

 

Phase 5: Deployment to production

 

Once the test tenant exists, a production tenant can be requested from it at any time. A newly requested production tenant usually comes without the PDI solution. It is then possible to deploy the PDI solution to the production tenant and afterwards merge the configuration from test to production.

 

The developed solution can be deployed to production either from the development or from the test tenant.

 

Bildschirmfoto 2015-08-27 um 17.04.10.jpg

 

It is okay to download the solution from the test tenant and deploy it to production. However, this is not necessary, as it would result in the same file that was uploaded to the test tenant, so that file can be reused. The patch file also contains the full solution, so you only need to deploy the latest version.

 

Upgrade Life-cycle

 

In contrast to the SAP on-premise products, SAP Cloud for Customer is upgraded on a quarterly basis. Because the systems are used by multiple customers, the upgrade times are fixed and cannot be changed. The SAP Cloud landscape is divided into a test landscape and a production landscape:

  • Test Landscape: Maintenance windows on weekdays, release upgrades two weeks prior to production. All tenants in this landscape have the tenant role code "Test".
  • Production Landscape: Fewer customers per server, maintenance windows on weekends/outside of business hours, upgrades are performed two weeks after the successful test landscape upgrade.

 

Implications for PDI:

PDI solutions are compatible with up to two higher releases, but not with a lower release. The release a PDI solution belongs to is the tenant release at the time the solution was downloaded. For example: a PDI solution downloaded from a tenant on release 1411 can be uploaded to a tenant on release 1502 or 1505. The upload will fail on a system with release 1408 (lower release) or 1511 (three releases ahead).

 

The test systems are upgraded two weeks prior to production. In these two weeks, a patch created in the test landscape cannot be implemented in production.

  • Option 1: Wait until production is upgraded and upload the patch afterwards
  • Option 2: Use the intermediate bug fix functionality and fix production directly (only code is supported, functionality must be enabled by SAP)
  • Option 3: SAP can provide a temporary tenant (copy from production) which can be used to create a patch on the lower release and patch production immediately. This process is a lot of effort and should only be followed for very critical issues / escalations and can only be performed by SAP.

 

Bildschirmfoto 2015-08-27 um 17.11.03 1.jpg

 

 

SPECIAL CASE: Tenant Copy

 

In order to copy a tenant with custom development in it, a few topics need to be taken into account. Here it is necessary to know the system setup. SAP Hosting usually takes care of this, but they do not always know the exact purpose of a tenant. Therefore it is important to align with them to avoid stepping into one of the traps described below.

 

This is especially important when a tenant copy process was not triggered from the system, but via an incident or other non-official channels.

 

Scenario 1: Copy Development Tenant with only initial Development on same system

 

Bildschirmfoto 2015-08-27 um 17.16.59.jpg

Details: The initial development (see Phase 1) is performed in the Original Solution. The solution is in status "In Development". The same solution in this status can only exist on one tenant on one physical system. If the Cloud Applications Studio detects the same solution twice in status "In Development" on one physical system, it will disable both solutions, and further development will not be possible until one solution has been removed (an SAP support activity).

 

Options:

  • If the solution gets downloaded using the "Download and Assemble" function in the Implementation Manager, the solution status is set to "Assembled". In this status a tenant copy to the same physical system is possible. After the copy, it is possible to create a patch in one of the tenants.
  • It is possible to copy a tenant with an "In Development" solution to another physical system. You have then two development solutions that are independent. There is no way to sync the solutions later on. This can only be advised as a temporary solution where one of the tenants will be deleted afterwards or for demo or POC purpose where the life-cycle is not important.


Scenario 2: Copy Development Tenant with an Original and a Patch solution on the same system


Bildschirmfoto 2015-08-27 um 17.20.04.jpg

Details: The Original Solution is no problem in this scenario. The patch solution is in status "In Development". See "Scenario 1".


Options:

  • If the solution gets downloaded using the "Download and Assemble" function in the Implementation Manager, the solution status is set to "Assembled". In this status a tenant copy to the same physical system is possible. After the copy, it is possible to create a patch in one of the tenants. This is not a good solution as it is possible to create a patch on both tenants and this would result in two "In Development" solutions.
  • It is possible to copy a tenant with an "In Development" solution to another physical system. You have then two development solutions that are independent. There is no way to sync the solutions later on. This can only be advised as a temporary solution where one of the tenants will be deleted afterwards or for demo or POC purpose where the life-cycle is not important.
  • It is possible to "Download and Assemble" the solution and then deploy it on the same tenant to update the Original Solution. Then delete the Patch solution, perform the tenant copy, and create the patch solution again from the Original Solution. When recreating the patch solution, you must perform the steps described in Phases 2, 3, and 4 again.


Scenario 3: Copy Development Tenant with only an Original Solution to another tenant

 

Bildschirmfoto 2015-08-27 um 17.22.19.jpg


This scenario is supported. If the Original Solution is in Status "In Development" you will end up with two development tenants that are independent. Make sure that the solution gets assembled on one tenant and that further development is performed only on the other tenant after the copy. Or assemble the solution before the copy is performed and create the initial patch only on the future development tenant.

 

 

Scenario 4: Copy Development Tenant with a Patch and an Original Solution to another tenant

 

Bildschirmfoto 2015-08-27 um 17.24.09.jpg

 

 

This scenario is supported. You will end up with two development tenants. Make sure that further development is only performed on one tenant. Also take into account that on both tenants the Patch solution is active. Therefore it is advised to delete the patch solution on one tenant and activate the original solution. Be aware that activating the Original Solution will result in the same consequences as described in chapter Phase 3.

 

Scenario 5: Copy a Test Tenant to another system.

 

Bildschirmfoto 2015-08-27 um 17.25.36.jpg

This scenario is supported and a recommended option.

 

 

Scenario 6: Copy a Test Tenant to another tenant on the same system

Bildschirmfoto 2015-08-27 um 17.26.25.jpg

 

 

 

This scenario is supported. Be aware that a solution on multiple tenants on one physical system must have the same version. If a patch is deployed for example to tenant 112, the solution will also be updated in tenant 113 (or the other way around).

 

 

Best Practice: Changing the development tenant

 

Sometimes it is necessary to move the PDI development to another tenant. How this can be done depends on the status of the PDI solution that needs to be moved. If custom development has not yet been assembled for a deployment, the landscape should look like this:

 

Bildschirmfoto 2015-08-27 um 17.27.23.jpg

 

The procedure in this case is simple:

  1. Download and Assemble the original solution in tenant 111
  2. Upload and Activate the solution in tenant 112 (1)
  3. Create a patch in the new development tenant (2)
  4. Activate the patch for business users (3) see also chapter phase 3
  5. Activate the solution in scoping and configure it

 

Make sure that nobody is creating a patch on the original development tenant. Parallel development is not supported. If both tenants are on the same system, both patch solutions would be disabled and SAP would need to resolve the situation.

 

This process can be performed if the new development tenant is located on the same system as well as if the new tenant is on another system.

 

If custom development has also progressed and patches have been created, you are most likely in this situation:

 

Bildschirmfoto 2015-08-27 um 17.48.45.jpg

 

The procedure in this case:

  1. Download and Assemble the patch solution in tenant 111
  2. Upload and Activate the solution in tenant 112 (1)
  3. Create a patch in the new development tenant (2)
  4. Activate the patch for business users (3) see also chapter phase 3
  5. Activate the solution in scoping and configure it

 

Make sure that nobody re-opens the patch solution in the original tenant. Parallel development is not supported. If both tenants are on the same system, both patch solutions would be disabled and SAP would need to resolve the situation.

 

If you want to delete the patch solution on the original tenant, this must be performed before a patch is created in the target tenant when both tenants are on the same system.

 

About this document

 

This document has been written to answer commonly asked questions, and it provides a full picture of the PDI deployment possibilities. The goal is to describe the solution life cycle to partners, PDI developers, and project teams. Some processes have been simplified and are more complex under the hood. Following the described recommendations should save you from PDI-related life-cycle problems.

How to download & install the Cloud Applications Studio


I got asked multiple times where the Cloud Applications Studio can be downloaded and what is needed to install it.


The main page for the SAP Cloud Applications Studio is here:

https://wiki.sme.sap.com/wiki/display/AMI/SAP+Cloud+Applications+Studio

 

Here, the prerequisites and installation steps are documented:

https://wiki.sme.sap.com/wiki/display/AMI/Installation+Corner

 

Once these steps are completed, search for “SAP Cloud Applications Studio” in the software download center:

https://support.sap.com/swdc


You can find the current version in the software download center:

(If you don't have access to download it, you need to request the authorizations from SAP)

sdk_install.png

Fix: BC Sets not visible in Fine-Tuning


Have you ever experienced the issue of BC sets that do not show up correctly in fine-tuning?

 

bco_issue.png


The fine-tuning activity shows the separator lines, but the BC view itself is not there?

 

This is most likely due to an inconsistency between the bac element and the BCO.

 

Please try the following activation sequence to fix the issue:

  1. Open and then activate the .bco file
  2. Open and then activate the .bcc file
  3. Repeat for all .bco and .bcc files missing in fine tuning
  4. Open the .bac file, hit next, next, next, finish.
  5. Activate the .bac file
  6. Deploy Business Configuration on solution level.

 

In step 4, check that the BCO is assigned to a scoped business option. Even if you know it is, you need to open and save the .bac file, as it performs some consistency checks when it is opened.


This should help you to bring the BC configuration back to life.


Best Regards,

Stefan

OBN Navigation to Thing Inspector Problem: "PackedKey"


If you have set up a list that uses an OBN navigation to a TI screen instead of a thing navigation, you might run into this problem.

 

An OBN navigation from a list to a TI floorplan works on the web, but not on the iPad. The ID will not be parsed correctly and will look like this:

 

"PDI_ABSL_IDENTIFIER$ObnBackendPackedKey$<?xml version="1.0" encoding="utf-16"?><asx:abap xmlns:asx="http://www.sap.com/abapxml" version="1.0"><asx:values><PARA>000000000000000000000000000000000000000000000000002000000432</PARA></asx:values></asx:abap>"

 

And on the iPad it will show up like this:

packedKey1.png

The solution is actually quite simple, once you know it.

 

Set the attribute "Use RawValue On ThingNavigation" to true in the Data Model on the Outport ID field of the source screen, and the ID is passed without the XML wrapper.

packedKey2.png

 

The navigation will now work as expected on the iPad!


Best Regards,

Stefan


Script for a node duplicate check


A while ago I wrote a very short ABSL script that works very well, although many people do not know why.

 

The goal was a duplicate check on a node. Imagine you have a BO with an Accounts node. Now you add accounts to this node, but would like to prevent an account from being added twice.

 

Put this into the AfterModify of the node:

 

// Check for duplicate accounts in the list
foreach (var entry in this.ToRoot.Accounts) {
    if (entry.ID == this.ID && this != entry) {
        raise MyMessage.Create("E", "Account can only be added once.");
    }
}

 

The interesting part is the if statement in the loop, which checks whether there is already an entry with the same ID as the current one. However, the new node entry is already in the buffer and therefore in the list. How do you know if you found the current entry (which is ok) or a duplicate one (which is not ok)?


The magic is in "this != entry", which compares the node ID of the currently created entry with the node ID of the loop entry. If these differ while the account IDs match, you have found a duplicate.


So keep in mind that "this" is also just a UUID, which can come in handy for use cases like this.


Best Regards,

Stefan

Understanding "Deployment Units"


Hello,

 

Sometimes issues come up where you cannot update fields on objects that are flagged as writable in the repository viewer. This is often the case when your solution has been assigned to a deployment unit that cannot write to the business object you are trying to update.

 

Why does this happen?

 

When you create a Solution, you're asked to assign it to a deployment unit.

 

du.png

 

In this case, the deployment unit "Foundation" has been chosen. This means that all custom business objects created in this solution are created in the deployment unit "Foundation".

 

But what happens if you would like to access a business object in another deployment unit? Let's try to update the delivery date of a purchase order.
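As a minimal ABSL sketch of the situation (the query pattern and element names are illustrative assumptions, not verified against the PurchaseOrder public model):

// Read access across deployment units is allowed
var query = PurchaseOrder.QueryByElements;
var selectionParams = query.CreateSelectionParams();
selectionParams.Add(query.ID.content, "I", "EQ", "4711"); // illustrative ID
var result = query.Execute(selectionParams);
foreach (var po in result) {
    var id = po.ID.content;  // reading works
    // Assigning to an element, e.g. a delivery date, from a solution in a
    // different deployment unit fails: only read access is allowed cross-DU.
}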

 

read_only.PNG

This will lead to an error. But why? According to the repository explorer, write access is allowed!

 

rep.png

 

But the repository viewer also shows another piece of information: the PurchaseOrder was created in the deployment unit "Purchasing". Due to the logical encapsulation of the ByD deployment units, only read access is allowed across DUs.

 

How to overcome this limit?

 

Well, this can be easy or difficult depending on what you're doing.

 

Option 1: You can "move" the custom business object to the Purchasing deployment unit. It will then live next to the Purchase Order and have write access. This can be achieved with the annotation [DeploymentUnit(Purchasing)] in front of the business object definition.

bo.PNG
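A minimal sketch of such a business object definition (the BO and element names are hypothetical; the point here is only the placement of the annotation):

import AP.Common.GDT as apCommonGDT;

// Placing the custom BO in the Purchasing deployment unit gives it
// write access to PurchaseOrder.
[DeploymentUnit(Purchasing)]
businessobject PurchasingHelper {
    [AlternativeKey] element HelperID : ID;
    element Note : LANGUAGEINDEPENDENT_MEDIUM_Name;
}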

Option 2: The more difficult use case is when you have one object that needs to write to multiple objects in different deployment units. Then you cannot move the business object to one deployment unit without losing write access to the other. In this case you can create a helper business object in the read-only deployment unit and communicate with it using a web service. This happens only in very rare cases.

 

 

Special deployment unit "Foundation"

 

The deployment unit "Foundation" is an exception. It is the underlying deployment unit, and all other deployment units have write access to the Foundation (but not the other way around!). Therefore it is advisable to create a solution in the deployment unit that matches the purpose of the solution. Creating a solution in the Foundation deployment unit is only advisable if you only need write access to master data objects.

 

I hope this helps you to understand deployment units a bit better.

Solution Documentation Tool


Motivation

In my past as a developer with the Cloud Applications Studio, I missed an automatic documentation generator that builds documentation based on the already-written code and its comments. Other development environments offer such documentation generation, so I decided to build a simple generator for cloud solutions on my own.

 

Implementation

The current version of the documentation generator is implemented in C# with the .NET Framework.

 

Current Features

  • Solution and Business Object File Parser
  • HTML Documentation Generation for each Business Object
  • HTML Overview Page for the Solution with links to the Business Object files
  • Capability to style the output as you like (CSS)

2015-09-18_0006.png

 

Getting Started

  • Clone the GitHub Repository (Rugosh/SolutionDocumentationGenerator · GitHub) and build the program or download the alpha release
  • Run the program with your Solution and Output path as parameters
    Example: SolutionDocumentationGenerator.exe C:\Users\tok\Documents\CopernicusIsolatedShell\Projects\BYD_DEV\YEKRNL1PY D:\dev\_testdata\bydDocu
  • View your generated Documentation

 

Forecast

I want to extend the documentation generator over time with new features, such as other output formats (e.g. Word) and parsing of other file types for more documentation possibilities.

To stay up to date on what has changed, and to suggest what you want to see, visit the repository (Rugosh/SolutionDocumentationGenerator · GitHub) and contribute.

How to- SDK/PDI for Key Users and Admins


Dear Community,

 

While many of you are true experts in SDK/PDI, sometimes the simple things in the SDK/PDI tool are not explained with examples. I have created a blog series on the C4C community, aimed at key users, about how to use SDK/PDI. Here it is.

 

Enjoy!

 

How to- SDK/PDI for Key Users and Admins

Extension Field Types: Adaption Mode / Page Layout / SAP SDK


