SAP Cloud Applications Studio

Pre-requisites for creating a PDI related incident at SAP Support


This blog outlines the data that you need to provide when creating an incident with SAP Support. Providing this information up front helps avoid delays in incident processing due to missing information and reduces unnecessary back-and-forth communication.

 

  1. If you are reporting the issue from a different tenant, please specify the tenant URL where the issue actually occurs.
  2. Solution name and customer name.
  3. Name of the UI component where the issue occurs, with the complete path of the affected UI component.
  4. Complete path in the SDK where you have developed the functionality, with the relevant screenshots.
  5. If the issue is in the UI, the steps to reproduce it in the UI.
  6. Permission to reproduce the issue at our end, if it includes any operations that change data, such as Save or Execute.
  7. Contact details in the incident so that the support engineer can contact you directly in case any clarification is required.

 

Provide this information whenever you create an incident for a better incident processing experience.


Personalized query option missing in the front end


Dear All,

 

The personalized query is missing in the front end (HTML5 / Silverlight) as shown below, even though it is defined correctly in the UI Designer (SDK).

 


[Screenshot: front end without the personalized query]

Below is the screen from the UI Designer.

 

[Screenshot: query definition in the UI Designer]

 

Solution: You should configure the anchors for the queries defined in the UI Designer.

 

Regards,

Deepty Mohnani

Create a validation for the CNPJ tax number


To develop a validation for an extension field that does not exist in the standard, such as the tax number related to tax type "BR1" (CNPJ) in Brazil, the following algorithm can be used to validate the CNPJ tax document.
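As a worked example (using a made-up sample number), take the CNPJ 11.444.777/0001-61. The weighted sum of the first 12 digits with weights 5,4,3,2,9,8,7,6,5,4,3,2 is 214; 214 mod 11 = 5, and since 5 is not less than 2 the first check digit is 11 - 5 = 6. Repeating with the 13 digits and weights 6,5,4,3,2,9,8,7,6,5,4,3,2 gives 230; 230 mod 11 = 10, so the second check digit is 11 - 10 = 1, which matches the final "61".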

 

 

This procedure can be improved and optimized by using reuse libraries and loops; the following coding is only an initial reference.

First create the required objects (the CNPJ extension field and the CX_INVALIDTAX message used below).

Then put the following code in the "OnSave" validation:

 

import ABSL;

var posicao = true;                 // validation result: true = save allowed
var cnpj = this.CurrentCommon.CNPJ;

cnpj = cnpj.Trim();
cnpj = cnpj.Replace(".", "").Replace("-", "").Replace("/", "");

var cnpjLength = cnpj.Length();

if (cnpjLength != 14) {
    raise CX_INVALIDTAX.Create("E");
    posicao = false;
}
else {
    var tempCnpj = cnpj.Substring(0, 12);
    var soma = 0;
    var resto = 0;
    var digito = "";

    // Valida primeiro digito (first check digit): weights 5,4,3,2,9,8,7,6,5,4,3,2
    soma = 0;
    soma = soma + Numeric.ParseFromString(cnpj.Substring(0, 1)) * 5;
    soma = soma + Numeric.ParseFromString(cnpj.Substring(1, 1)) * 4;
    soma = soma + Numeric.ParseFromString(cnpj.Substring(2, 1)) * 3;
    soma = soma + Numeric.ParseFromString(cnpj.Substring(3, 1)) * 2;
    soma = soma + Numeric.ParseFromString(cnpj.Substring(4, 1)) * 9;
    soma = soma + Numeric.ParseFromString(cnpj.Substring(5, 1)) * 8;
    soma = soma + Numeric.ParseFromString(cnpj.Substring(6, 1)) * 7;
    soma = soma + Numeric.ParseFromString(cnpj.Substring(7, 1)) * 6;
    soma = soma + Numeric.ParseFromString(cnpj.Substring(8, 1)) * 5;
    soma = soma + Numeric.ParseFromString(cnpj.Substring(9, 1)) * 4;
    soma = soma + Numeric.ParseFromString(cnpj.Substring(10, 1)) * 3;
    soma = soma + Numeric.ParseFromString(cnpj.Substring(11, 1)) * 2;

    resto = soma % 11;
    if (resto < 2) {
        resto = 0;
    }
    else {
        resto = 11 - resto;
    }

    digito = resto.ToString();
    tempCnpj = tempCnpj + digito;

    if (digito != cnpj.Substring(12, 1)) {
        raise CX_INVALIDTAX.Create("E");
        posicao = false;
    }

    // Valida segundo digito (second check digit): weights 6,5,4,3,2,9,8,7,6,5,4,3,2
    soma = 0;
    soma = soma + Numeric.ParseFromString(cnpj.Substring(0, 1)) * 6;
    soma = soma + Numeric.ParseFromString(cnpj.Substring(1, 1)) * 5;
    soma = soma + Numeric.ParseFromString(cnpj.Substring(2, 1)) * 4;
    soma = soma + Numeric.ParseFromString(cnpj.Substring(3, 1)) * 3;
    soma = soma + Numeric.ParseFromString(cnpj.Substring(4, 1)) * 2;
    soma = soma + Numeric.ParseFromString(cnpj.Substring(5, 1)) * 9;
    soma = soma + Numeric.ParseFromString(cnpj.Substring(6, 1)) * 8;
    soma = soma + Numeric.ParseFromString(cnpj.Substring(7, 1)) * 7;
    soma = soma + Numeric.ParseFromString(cnpj.Substring(8, 1)) * 6;
    soma = soma + Numeric.ParseFromString(cnpj.Substring(9, 1)) * 5;
    soma = soma + Numeric.ParseFromString(cnpj.Substring(10, 1)) * 4;
    soma = soma + Numeric.ParseFromString(cnpj.Substring(11, 1)) * 3;
    soma = soma + Numeric.ParseFromString(cnpj.Substring(12, 1)) * 2;

    resto = soma % 11;
    if (resto < 2) {
        resto = 0;
    }
    else {
        resto = 11 - resto;
    }

    digito = resto.ToString();
    tempCnpj = tempCnpj + digito;

    if (digito != cnpj.Substring(13, 1)) {
        raise CX_INVALIDTAX.Create("E");
        posicao = false;
    }
}

return posicao;
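Purely as an illustration of the loop-based optimization mentioned above, the twelve repeated blocks for the first check digit could be folded into a while loop. A minimal sketch, assuming the weights follow the usual CNPJ pattern (5,4,3,2,9,8,7,6,5,4,3,2) and reusing the variables declared above:

// Sketch only - compact calculation of the weighted sum for the first check digit
var i = 0;
var peso = 0;
soma = 0;
while (i < 12) {
    if (i < 4) {
        peso = 5 - i;
    }
    else {
        peso = 13 - i;
    }
    soma = soma + Numeric.ParseFromString(cnpj.Substring(i, 1)) * peso;
    i = i + 1;
}
// Afterwards apply the same remainder rule as above; the second check digit
// works the same way over 13 positions with weights 6,5,4,3,2,9,8,7,6,5,4,3,2.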


Remember to also create the extension field object so that the field can be used, in particular on the screen.
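For reference, a minimal sketch of what such a customer extension could look like - the data type chosen for the CNPJ element is an assumption, and the message is the one raised in the validation above:

import AP.Common.GDT;
import AP.FO.BusinessPartner.Global;

[Extension] businessobject AP.FO.BusinessPartner.Global:Customer {

    // message raised by the OnSave validation above
    message CX_INVALIDTAX text "CNPJ tax number is invalid";

    node Common {
        [Label("CNPJ")] element CNPJ : LANGUAGEINDEPENDENT_EXTENDED_Text;
    }
}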

 

Create a dynamic codelist from BCO and BCC


Hello folks,

 

Hope you are doing great.

I'm blogging here because I found no material that centralizes how to create a field with a dynamic dropdown whose values are determined from the values already selected in other fields.


So, first of all, you need to create a new object.

 

And create a "Business Configuration Object":

 

Define the following field as shown below:

 

Important: If you want to show the description on the screen, you need to set the "Field Type" to "Description"; if you do not set this value, the information displayed is the code.

 

The second step is to create a "Business Configuration Set" like below:

 

I filled it with the following values (just as an example):

 

The data created above will be our reference field with the reference data.

So now we need to create the second codelist. We will repeat the steps above for the following data:

Important: for the second field, set the value "Code" in "Data Type".

 

Create the BCC for the BCO created above.

 

Now we need to create the extension fields in the BO.

For this example we will use the Customer extension business object, like below:

import AP.Common.GDT;
import AP.FO.BusinessPartner.Global;

[Extension] businessobject AP.FO.BusinessPartner.Global:Customer {

    // You must activate this business object before you can access the extension fields
    // or messages in script files, forms, and screens.

    node AddressInformation {
    }

    node Common {
        [Label("Estado")] element Estado : DEVESTADOCode;
        [Label("Cidade")] element Cidade : DEVCIDADECode;
    }

    node CurrentEmployeeResponsible {
    }
}

 

Note that the element type is the codelist created from the BCO.
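Purely as an illustration (this step is not required for the codelist mapping itself), the new extension fields can also be addressed from ABSL. A minimal sketch, assuming an Event-AfterModify script assigned to the Common node and assuming "SP" is one of the codes maintained in the BC set above:

// Event-AfterModify.absl on the Customer Common node - sketch only
import ABSL;
import AP.FO.BusinessPartner.Global;

if (this.Estado.IsInitial()) {
    this.Estado = "SP";    // example: default the state so the dependent Cidade list is filtered
}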

Save and activate all objects.

 

In the final step we will create an Embedded Component to hold these fields.

So, go to the option to create a new item and select the following option.

After creating it in your solution, double-click it to open it.

First of all, select in the "BO Browser" the fields created in the extension business object, like below:

 

 

Drag and drop them onto the screen in the Design tab.

 

Now go to the Data model tab

 

Create a data structure like below.

 

 

Add the field

 

 

Drag and drop the fields from Select Elements, and your Data Model will look like below:

 

Now go to the Controller tab

 

In the Inport folder create a new Inport like below

 

Add a new parameter and configure it like below:

 

So, with the Inport selected, go to the "Properties" tab.

Set "RequestFireOnInitialization" to "True".

 

And create a new event in the "OnFire" events

 

In the event "Inicialize" configure something like that:

 

For your reference, to create an EC you can follow this video:

Cloud Application Studio: Add Facet (Embedded Component) based on BO Extensions - YouTube

 

In the last step, go back to the "Data Model" tab and right-click the "Cidade" field (the field whose values will be determined by the referenced field).

Select the option "Codelist Context Mapping"

 

Select the field of the referenced BCO that corresponds to the extension business object field.

 

Click "OK".

 

Now Save and Activate your EC (Embedded Component).

 

The process is done.

In your test, the field "Cidade" will be populated based on the data selected in the "Estado" field, like below:

 

 

 

Regards

Deprecation Analysis of &lt;your add-on&gt;


Dear SAP Cloud Application Studio Developer,


you might have recently received an email titled "Deprecation analysis of &lt;your add-on&gt;".


The deprecation check behind the email was executed for all productive tenants. With this check, the deprecation messages of the SAP Cloud Applications Studio syntax check were made visible - also for add-ons which haven't been touched for a while.


As you might know, the deprecation mechanism of the public solution model is in place to enable SAP to continue with new - potentially incompatible - developments and to set the old entities to deprecated, indicating that they should not be used anymore and that, if possible, their usage should be replaced.

New features might only be supported for the newly created objects. So whenever possible, replacing the usage of deprecated objects makes sense.


However, the analysis of the productive tenants has shown that not all issues indicated by the syntax check are really places requiring a replacement. In most cases the old functionality continues to run, because the developers have built everything in a compatible way, also supporting the old entities. For example, data types cannot be replaced in an add-on - they will always be supported in the future.

(The email stated: "It is recommended to replace this usages with the suggested alternative." - no immediate action is required.)


In most cases you find more information about the deprecation in the online help.

 

It is intended to provide here some more information about frequently asked questions.

 

The email was sent out on the 25th of May; it will be sent out again three weeks later (should be the 15th of June). After this, it will be sent out only once per release. In the future, the email text will read more like "For your information".

 

Best regards,

Stefan Kiefer


------------------------

Update as of 06.07.2016


As more requests come in I see that three things cause confusion:

1. The report had a small bug with a big effect: if a tenant contains multiple add-ons, the deprecations of add-on 1 were also listed under add-on 2, and so on.

Programmers know what the cause is: a variable was not cleared for the next iteration. However, for every entry the full path is listed, so it is easy to identify which add-on a line belongs to and to ignore the others.


2. My assumption that the Studio syntax check shows the same messages - so that every add-on responsible could display them again in the Studio - turned out to be wrong. I don't know which mode I was in when I originally got the messages in the Studio, but I was not able to see them in the Studio in my recent tries.


3. Some messages address situations which cannot be resolved - for example, exchanging a data type, or removing something, which is not possible because the solution is assembled.



Considering all this, at the moment the email can only be rated as a hint. For some situations it might be useful to replace the usage of a standard object by a newer one. Generally, it cannot be expected from Cloud Studio developers that these places are changed, unless the Studio also raises them as error messages.


So I am a bit sorry for having caused confusion with this service. However, from the SAP side we now have visibility on the various deprecation cases, which we did not have before. So we will try to improve the email in order to point out only really valuable information in the future.

As already mentioned above, there is no urgent action required.

Reactivation of the openSAP course: Application Development for SAP Business ByDesign


Since the early days of SAP Business ByDesign development, it was clear to the architects that ecosystem developers would be a game-changing factor for the product's success. Nowadays this seems obvious to many of us who have worked in this area since those early days. Even better, many of our customers and partners take advantage of capabilities such as extensibility, integration, adaptation or business configuration as key drivers for customer adoption and new market introduction.

 

Building on that principle, the Partner Development Infrastructure and the Cloud Application Studio remain a strategic investment pillar in conjunction with the ecosystem. Along this line, the goal of facilitating enablement, as much as we can, for any consultant, developer or IT expert remains important to us.

 

Today, with this openSAP reactivation option, you can take this online course and get a new opportunity to complete the weekly assignments and final exam in order to earn a Record of Achievement at your own pace. Obviously, partners who previously earned such a Record of Achievement in this course may not need to consider it.

 

This first reactivation course for SAP ByD implies some logistical effort on our side. Thus, once registered, you will receive a code and further details which will allow you to reactivate this course and get the opportunity to complete the assignments and final exam.

 

Today we have about 2,000 extensions deployed at productive customers, with amazing growth in add-ons! It is very inspiring to see how developers at ISV partners contribute to extending the product; this continuous innovation effort on SAP ByD is broadly confirmed.

This enablement training should be as instrumental for new partners and customers as it is for our installed base today.

 

Take advantage of it!

A2A Internal Communication : Full Guide


Hi everyone,

 

I thought it would be nice to post something about this topic, which is very useful but still a mystery for some. I recorded a video (tutorial + explanation) about what you can do with an Internal Communication.

 

Internal Communication

 

The purpose of this blog is to collect all relevant information, tips, good practices, workarounds, etc. regarding Internal Communication.

You can find more details about it in the complete reference here.

 

Feel free to ask or share about IC!

 

Hope this helps.

Addressing the n+1 landscape dilemma - Approach 1


Approaches to address continuous development on objects across releases


This blog series talks about the scenarios in C4C where you need to address the following:

 

- Phase 1 (P1): You have a Dev -> Test -> production landscape where you have done PDI developments on Dev, tested on Test, deployed it to Production and gone Live

 

- Phase 2 (P2): Now, you need to fix bugs that arise in the P1 environment, plus you also need to do additional NEW developments to

     - SAME objects that are there in P1

     - NEW objects introduced only with P2

 

This can be a challenging scenario depending upon how your project was structured – in terms of identifying the release timelines and gaps between the releases, the extent to which you do ‘continuous’ development across releases on the same objects, whether you are mainly introducing new functionality only with new objects, and the time you allocate to tests and bug fixes before starting a new release.


Also, do remember that in general, PDI should be used for last-mile scenarios and development. For full-fledged development, you can use HCP.

 

Typically, we have the standard 3 tenant landscape as such:

Develop on Dev1, deploy to Test1 for testing in an Integrated environment and deploy to production


[Figure: standard 3-tenant landscape]

The approaches that we discuss in this blog series depending on the scenario are:

 

1. Do limited Production Bug Fixes directly, and continue development on standard 3-tenant landscape

2. New BOs created in the existing solution

3. The SWITCH option

4. Commenting out code


 

Approach 1: Recommended: Do limited Production Bug Fixes directly, and continue development on standard 3-tenant landscape (this blog)


Scenario: In Phase 2, you may have a situation where there are bug fixes that can be identified by a concerted early test process

Objective: In this approach, the objective is to address the bug fixes early on, BEFORE we get into a Phase 2 development cycle. In case still needed, for Very High issues, address them directly on production

 

  • Continue with the existing 3-tenant Landscape
  • Perform thorough testing in the Test tenant to identify and fix issues early
  • After go-live, allow another window – at least 2 weeks – to see if any high-priority issues are reported, and fix them with the normal patch process. Most critical issues are generally reported in this window and get corrected early on. Until then, no new developments should start for Phase 2.
  • Later, during the course of Phase 2 development, if there are still urgent and very high priority issues, fix them (ABSL only) directly in the production tenant using the Production Bug Fix feature, and double-maintain that code in the Dev tenant. Hence, no further patches are introduced for bug fixes; instead, only direct production bug fixes are made. The next patches will only come with Phase 2 completion.
  • With 1608, you now have the production fix in the Pre-production tenant as well so that you can test it before doing this on the production. For details, you can refer to the PDI rollout of 1608
  • All other issues with a lower priority can get clubbed with the Phase 2 release/go-live

 

The tenant landscape doesn't change in the above scenario. With the same 3-tenant landscape, Phase 2 new development also comes in as a regular patch, along with low-priority issues from Phase 1 that didn't get addressed through the production bug fix.

 

In general, if there is a lot of new functionality for a subsequent release, it is advisable to have a shorter release in between to ensure that the number of bugs to be fixed is minimized for that release, since less functionality gets introduced. It also means that the effort of testing and correcting a huge chunk of development is reduced and becomes more effective.


Below: Production Bug Fix Method explained

[Screenshots: Production Bug Fix method]

Approach 2: When new BOs are created in the existing solution (next blog): Addressing the n+1 landscape dilemma - Approach 2

 

This is a 4 part blog series with the following links:

Addressing the n+1 landscape dilemma - Approach 1  (This blog)

Addressing the n+1 landscape dilemma - Approach 2  (When New BOs are created in the existing solution)

Addressing the n+1 landscape dilemma - Approach 3  (Switch Option)

Addressing the n+1 landscape dilemma - Approach 4  (Commenting out code)

 

With contributions from Pramodh Patil, Stefan Hagen and Sridhar Natarajan.


Thanks

Vinita


Addressing the n+1 landscape dilemma - Approach 2


In the previous blog, Addressing the n+1 landscape dilemma - Approach 1, we talked about the approach using direct production bug fixes. In this blog, we talk about the approach when the next wave of development is for new objects only.

 

 

 

Approach 2: New BOs created in the existing solution


Scenario: Your next wave of development is essentially only for new objects


Objective: Achieve differentiation by segregating the NEW BO using Business Options

 

 

In the existing PDI solution, if a new BO is to be created, it can be linked to a separate Business Option (provided a Business Topic was created for the existing PDI solution), thereby allowing the user to scope it in the production tenant only when Phase 2 goes live. Until then, continue to test in the test environment. Here again, one can continue with the existing 3-tenant landscape, since the new BO development needs only the additional scoping in the production tenant when it is ready. On the existing Dev and Test environments the new BO and UIs would of course be scoped, but this doesn't affect or interfere with the existing code (assuming, of course, that there is no cross-referencing of the new BO in the existing code).

 


ABSL Code: Since this is linked to the new Business Option, the code wouldn’t be visible unless scoped in the Production tenant


New WoC/UI: Since this is linked to the new Business Option, the UIs and the new WoC wouldn't be visible unless scoped and assigned in the production tenant.


New Custom UI associated with the standard screen: Since this is linked to the new Business Option, the additional enhancement related to the new BO wouldn’t be visible unless scoped and assigned in the Production tenant

 

This is a 4 part blog series with the following links:


Addressing the n+1 landscape dilemma - Approach 1  (Production Fix)

Addressing the n+1 landscape dilemma - Approach 2  (When New BOs are created in the existing solution) - This Blog !

Addressing the n+1 landscape dilemma - Approach 3  (Switch Option)

Addressing the n+1 landscape dilemma - Approach 4  (Commenting out code)

 

 

With contributions from Pramodh Patil, Stefan Hagen and Sridhar Natarajan.

 

Thanks !

Vinita

Addressing the n+1 landscape dilemma - Approach 3


In the previous blog, Addressing the n+1 landscape dilemma - Approach 2, we talked about the scenario where only new objects were being created in the new phase of development. In this blog we talk about using the SWITCH option.


Approach 3: Switch Option

Scenario: The more complicated case where, in Phase 2, you need to fix bugs that go to production, while you might still be doing additional development on the same object that is not yet complete and is not in a state to be made available on production. Since there is only ONE development tenant on which you maintain your code, understanding how to take care of the above challenge is key.

 

Objective: Although the code is still the same across both environments, achieve differentiation by using switches and hiding what is not needed on Phase 1 environment

 

There may be a variant of the 3-tenant landscape if in Phase 2 you want a dedicated, separate test environment to test fresh development related to Phase 2 and not see it on production till it is ready (let's call this the n+1 landscape). This is shown below – the Dev tenant remains the same, since you cannot have more than one development tenant; you also have the option of an 'additional' test tenant which will now have code developed for Phase 2 and tested in a separate testing environment meant only for Phase 2. Of course, the Phase 2 related code is NOT meant to be visible or testable on the Test1 environment nor in production.

 

Remember: there is only ONE Dev tenant. All code enhancements - be it for bug fixes or additional development - get deployed to both test tenants. You cannot have multiple code lines for the two phases in separate tenants and think of merging these later. What you can do is hide, in that single code line, what is not yet needed in the Phase 1 environment.

 

We need to look at all components of development though: ABSL, UI elements through KUT or PDI (within or without embedded components), data sources and extension scenarios.


[Figure: n+1 landscape with an additional Phase 2 test tenant]

In the figure above:

 

In Phase 1, Development happened on Dev1, tested in an Integrated PI 1 environment on Test1, and deployed in production

 

In Phase 2, Development continued on Dev1, and tested on a phase 2 specific integrated Test environment Test2

 

There are 2 situations now in Phase2:

Bug Fixes -> Should be there in Phase 1 environment + Phase 2 environment

New development -> Should be there only in Phase 2 environment

 

But HOW do we make this differentiation when there is just ONE coding tenant and the object for the bug fix and the new development is the same, or when we have a completely new object introduced only in Phase 2?

 

What you can do to achieve this differentiation is:

  • Create a BC Fine Tuning Option to denote: Phase1 or Phase 2. Depending upon what this is set to, you can control the absl code.

 

  • ABSL Code: For all new code written for Phase 2, precede it with a check for the fine-tuning value Phase 2. Hence, this code gets executed only if the fine-tuning value is Phase 2 (see the sketch after this list).

 

  • New extension fields through KUT and PDI on the UI: Make these visible in production only in Phase 2, by controlling the visibility on the UI via page layouts and business roles.

 

  • Visibility of PDI fields on Data Sources and Process Extensions: Extension fields cannot be hidden if they were introduced in PDI.

 

  • Visibility of KUT fields on Data Sources and Process Extensions: This can be controlled by not enabling this field in the “Further usage”
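A minimal sketch of such a switch check in ABSL (all names are made up; it assumes the phase indicator is kept in a custom BCO/BC set that can be read via a query):

// Sketch only - guard new Phase 2 coding behind a fine-tuning value
var phase2Active = false;
var phaseQuery = PhaseControl.QueryByElements;     // PhaseControl: hypothetical BCO holding the phase switch
var phaseRows = phaseQuery.Execute();
foreach (var row in phaseRows) {
    if (row.PhaseCode == "PHASE2") {
        phase2Active = true;
    }
}
if (phase2Active) {
    // ... new Phase 2 coding goes here ...
}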


This approach, as well as any integration challenges, will be explained in detail in another upcoming blog by Pramodh and Sridhar.


In the next and final part of this series, we talk about the commenting-out-code approach: Addressing the n+1 landscape dilemma - Approach 4


Addressing the n+1 landscape dilemma - Approach 1  (Production Fix)

Addressing the n+1 landscape dilemma - Approach 2  (When New BOs are created in the existing solution)

Addressing the n+1 landscape dilemma - Approach 3  (Switch Option) - This Blog !

Addressing the n+1 landscape dilemma - Approach 4  (Commenting out code)

 

 

With contributions from Pramodh Patil, Stefan Hagen and Sridhar Natarajan.


Thanks

Vinita

Addressing the n+1 landscape dilemma - Approach 4


In the previous blog, Addressing the n+1 landscape dilemma - Approach 3, we talked about using the SWITCH option. In this blog we talk about the final approach: commenting out the code.

 

Approach 4: Commenting out Code and deploying the fix


For urgent fixes, it is also possible to comment out the newly developed code and then deploy the fix.

 

Just like with Approach 3, the same restrictions apply here as well for the PDI and KUT extension fields on the UI, data sources and process extension scenarios.

 

Addressing the n+1 landscape dilemma - Approach 1  (Production Fix)

Addressing the n+1 landscape dilemma - Approach 2  (When New BOs are created in the existing solution)

Addressing the n+1 landscape dilemma - Approach 3  (Switch Option)

Addressing the n+1 landscape dilemma - Approach 4  (Commenting out code) This Blog !

 

 

With contributions from Pramodh Patil, Stefan Hagen and Sridhar Natarajan.

Thanks !

Vinita

Creating a HTML Mashup with a Dynamic URL


In this tutorial, I will explain how to create an Embedded Component (EC) that takes a dynamic URL as a parameter and displays it on a mashup within the EC. This means the same mashup can potentially be used to display a number of different web pages, as the dynamic URL can be generated using ABSL, retrieved from another BO, etc.

 

Create your solution and add a Port Type Package (PTP).

[Screenshot]

 

I will add it to the Customer Business Object for this example. (Displays from 1605 onwards)

[Screenshot]

The PTP will act as the ‘middle man’ between the Floorplan that will contain the EC and the Mashup itself.

Open the PTP in the UI designer and set it up as below.

[Screenshot]

 

Add a Port type and a Parameter called DynamicURL as shown above.

 

Next we add a Port Binding to our solution.

[Screenshot]

It will appear in the Project Explorer window.

[Screenshot]

Open the PortBinding and change the settings as below.

[Screenshot]

 

Notes:

You can select whatever Category you wish.

The Inport Type Package we have used is the PTP we created in the previous steps. Our Inport Type Reference is the parameter we added to this PTP.

Next we add an HTML Mashup.

 

[Screenshots]

It will open the following screen

[Screenshot]

Select the Mashup Category we assigned to the PortBinding in the previous steps. Our custom PortBinding should then appear.

[Screenshot]

Here is the trick to using a dynamic URL. We enter the URL as http://{DynamicURL}. The {} denotes a variable, and once you click on Extract Parameters the table below is populated. You could potentially have as many of these as you like (for example for URL parameters), but for this example one will be fine. You can also adjust the height (in pixels) of the mashup here. Save the mashup once complete.

Note that I am using http here for display purposes. You will need to use https URLs for live projects, as most web browsers will not allow http content in an https page (e.g. C4C) for security reasons.
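For illustration only (a made-up template): a URL such as http://{Host}/search?q={SearchTerm} would extract two parameters, Host and SearchTerm, into the table, and both can then be supplied through the port binding.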

Now the Project Explorer window should look something like the following.

 

[Screenshot]

Now that the mashup is set up, we will proceed to building our EC and adding it to the Customer TI. You can also add the EC to whatever BO you wish, even custom ones.

 

[Screenshot]

Add the EC like above and open it in UI designer.

 

Find your Mashup from the Configuration Explorer.

[Screenshot]

 

Drag it to your EC.

[Screenshot]

 

Create an InPort structure in the Data Model with a parameter for DynamicURL on the EC, as below.

[Screenshot]

 

For testing purposes, we will give the parameter an initial value.

[Screenshot]

Create the InPort

[Screenshot]

 

Create an Outport for the Mashup and select our PTP created earlier.

[Screenshot]

 

Assign our DataModel InPort as the Parameter Binding.

[Screenshot]

 

Next we need to set the binding on our Mashup from the EC.

[Screenshot]

Click Bind and create a Navigation Configuration as below.

[Screenshot]

 

Bind the Outport field DynamicURL of the EC with the Inport DynamicURL of the Mashup.

Add the EC to a standard or custom BO, and you can bind the Inport (DynamicURL) of the EC to the Outport of your BO that contains your URL.
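As an illustration of the BO side (all names and the URL below are made up for this sketch), a custom BO could hold the URL in a text element and fill it in an AfterModify event; that element is then what you bind to the EC inport:

// Sketch only: custom BO whose element feeds the EC inport "DynamicURL"
import AP.Common.GDT as apCommonGDT;

businessobject MashupDemo {
    [Label("ID")] [AlternativeKey] element ID : ID;
    [Label("Target URL")] element TargetURL : LANGUAGEINDEPENDENT_EXTENDED_Text;
}

// Event-AfterModify.absl of MashupDemo (separate script file) - also a sketch:
//     import ABSL;
//     this.TargetURL = "www.example.com/customer/" + this.ID.content;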

 

Once you fire it, you should see something like the following.

[Screenshot]

 

Hope this helps somebody!!

Problem with New Feature Mandatory error message for OnSave validation


Hello community,

 

If you've read the What's New for SAP Cloud Applications Studio 1608, you have probably seen this.

 

Mandatory error message for OnSave validation

The system now displays an error message to the developer when a Save is rejected. There is an enhancement in the OnSave validation framework to check if any custom error messages are generated in the OnSave event while it returns false value. If not, the system displays a technical error message to explain which solution is responsible for the rejection.

 

 

Well, the problem is that you may have developed solutions that raise messages in the BeforeSave event, to improve performance or for other specific reasons. In this case you have probably used a transient flag to indicate whether the object is valid or not, and in the OnSave event you simply return this flag, thereby blocking the save.

 

In this scenario you will probably find an error message similar to this.

[Screenshot: technical error message displayed by the framework]

 

In order to remove this message, you will have to raise an ERROR message on the OnSave event. You can use something like this.

 

[Screenshot: example OnSave coding]
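If the screenshot is not available, a minimal sketch of such an OnSave validation looks like this (the transient indicator IsValid and the message name InvalidDataSaveRejected are examples and have to exist in your own solution):

// Validation-OnSave.absl - sketch only
import ABSL;

var result = true;
if (this.IsValid == false) {                        // transient flag set in the BeforeSave event
    raise InvalidDataSaveRejected.Create("E");      // custom error message defined in the solution
    result = false;                                 // the rejected save now comes with its own message
}
return result;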

 

DONE! Your result should be similar to this.

[Screenshot: custom error message displayed instead of the technical one]

Pre-requisites for creating a PDI related incident at SAP Support


This blog outlines the data that you need to provide when creating an incident with SAP Support. Providing this information up front helps avoid delays in incident processing due to missing information and reduces unnecessary back-and-forth communication.

 

  1. Please create the incident from the tenant where the issue occurs. In case the issue is reported from a production tenant and the feature works in your test or development tenant, please provide details of the other tenant where it behaves correctly.

 

   2.  Solution name and Customer name.

 

[Screenshot]

 

3. Detailed steps to reproduce the issue from the UI/SDK along with any supporting information like:

 

          a. Details of the objects involved in the error, e.g. UI, BO, scripts, actions, etc., and the path in the SDK where these are developed, with screenshots.

          b. Screenshots

          c. Error messages

          d. Request and response XMLs for web service issues

          e. In case of a suspected scripting error, provide implementation details: what do you expect to be executed? For example, if an action is clicked on the UI, should any script get executed, is any internal communication triggered, is any web service called, etc.?

          f. Please provide the UI Designer configuration (query details, data modeling, extensibility explorer) along with screenshots.

 

 

  4. Permission to reproduce the issue at SAP Support's end, if it includes any operations that change data, such as Save, Execute, etc.

 

  5. Contact details in the incident so that the support engineer can contact you directly in case any clarification is required.

 

Provide this information whenever you create an incident for a better incident processing experience.

 

Thanks and Regards,

Dhanya KV

SAP Cloud Application Studio (SDK) : Tips and frequently asked questions


The SAP Cloud Application Studio FAQ will help to answer questions that come up repeatedly or will resolve common issues.

 

The FAQ includes textual recommendations and points to discussions in the SAP Cloud Application Studio forums, blogs, articles, etc.


Some useful blogs and links to get you started with the Cloud Application Studio

SAP Cloud Applications Studio

SAP Cloud Application Studio Performance Best Practices

SAP Cloud Applications Studio Deployment & Landscape Basics

 

 

Best Practices for Scripting

 

The Cloud Application Studio has some tools that will help you analyze an issue arising out of the solution. Please consider using these before raising an incident.

 

  • Dump Analysis tool - You may check the dump analysis tool first for any dump encountered from the Cloud Application Studio solution. Dumps arising from the Cloud Application Studio solution are listed here. To use the dump analysis tool in the SAP Cloud Application Studio go to the View -> Dump Analysis option.
    • For more information related to this tool you may refer to the help documentation in the SDK - Developer Desktop -> Dump Analysis

 

 

  • Debugging - SDK offers you the capability to debug your solution.
    • For more information related to debugging you may refer to the help documentation in the SDK - Developer Desktop -> Debugging

 

  • Tracing - Use this feature to print out information from your solution as it is running. It can be started from Debug -> Windows -> Trace Explorer (see the short sketch after this list).
    • For more information related to tracing you may refer to the help documentation in the SDK - Developer Desktop -> Tracing

 

  • Performance Tips - To find improvements of code performance use the Performance Tips option. To view it, select the .absl file in the Solution Explorer -> right-click and select Performance Tips. This will list all the lines in the .absl script where code changes could lead to performance improvements.
    • For more information related to this option you may refer to the help documentation in the SDK - Business Objects -> Business and Technical Background
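As a minimal illustration of the tracing feature (assuming the Trace reuse library is available in your release), a script can write to the trace like this; the output then shows up in the Trace Explorer while tracing is switched on:

// Sketch only - writing information to the trace from an .absl script
import ABSL;

Trace.Info("OnSave reached - validation starts");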

 

 

What needs to be provided to SAP Cloud Product Support before opening an incident related to a Cloud Application Studio issue?


Please check the following Blog which states what SAP Cloud Product Support needs to process Cloud Application Studio incidents smoothly

Pre-requisites for creating a PDI related incident at SAP Support

 


What can you do to improve Mass Data Run Object's (MDRO) performance?


To improve the performance of a MDRO you could do the following:

  • Check if parallel processing is checked
  • Check performance tips on the corresponding Action script of the MDRO
  • Check if you can make changes to the query of the BO on which the MDRO is created, to reduce the result set (see the sketch below)
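The same principle applies wherever data is read in ABSL: restrict the query with selection parameters instead of filtering the full result set afterwards. A minimal sketch (the BO name and element below are made up):

// Sketch only
import ABSL;

var qry = MyCustomBO.QueryByElements;           // hypothetical custom BO behind the MDRO
var sel = qry.CreateSelectionParams();
sel.Add(qry.StatusCode, "I", "EQ", "1");        // process only open instances
var rows = qry.Execute(sel);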

 

 

You have encountered a dump in the UI for an Internal Communication. Why did it occur?

 

There are several issues which can lead to a dump in an Internal Communication. Some of them are listed below:

  • You have mapped an element from a custom node which was added to a standard business object. In the current system release it is not possible to select elements from custom-created nodes that belong to standard business objects.

 

 

Which queries are supported in Cloud Application Studio and which query should you use for your use case?

 

Please check the following Blog which describes all the different queries in the Cloud Application Studio and when to use which one:

Some Thoughts about Queries




Lifecycle Management of the Extension Solutions Built Using SAP Cloud Applications Studio


We would like to invite partners for SAP ByD and C4C to the upcoming webinar education session about "Lifecycle Management of the extension solutions built using SAP Cloud Applications Studio" on the 19th of September (15:30 CET). Please use the registration button to reserve your seat now! Attendees should have an advanced knowledge level of the Cloud Application Studio.


If registered attendees have any specific question to be addressed at this session, please contact Antonio.sanchez.coullaut@sap.com


The agenda will address the following subjects during the 90-minute session:

  • Lifecycle Management overview
  • Features
  • Best practices/ Recommendations
  • Recent features, Roadmap
  • Q&A


Looking forward to your participation!

Quality review missing for MCS solution.


Dear Users,

 

You might receive an error message while activating a Multi-Customer Solution (MCS).

 

"Download for Delivery to Customer is not possible.Quality review missing"

 

Steps to Reproduce:

 

  1. Go to the Cloud Application Studio.
  2. Open the required Multi-Customer Solution.
  3. Go to the Implementation Manager.
  4. Click on Activate or Download for Delivery.
  5. The error below is displayed.

 

[Screenshot: quality review error message]

 

 

Cause:

 

As per the standard MCS process, the created Multi-Customer Solution has to be reviewed by the SAP ICC team before it can be downloaded for customers.

This is expected behavior.


Resolution:

 

You need to contact the SAP ICC team, your SAP sales contact, or your partner coach. They will be able to provide you with more information on this.

 

For additional queries on MCS processes you can refer to the thread below:

 

http://scn.sap.com/thread/3845925

 

Regards

Anant

One of the common errors while deploying a solution


Dear All,

 

If the deployment of your solution is failing because of the error "Error when processing service provider of component AP-RC-ANA-DT", you can refer to this blog to resolve the issue.

 

Steps to Reproduce -

 

  1. Login to Cloud Application Studio.
  2. Open your required Solution.
  3. Go to Implementation Manager tab.
  4. Click on Activate.
  5. Error pops up.

 

[Screenshot: activation error message]

Cause -

 

The user is not assigned to the Business Analytics work center, and hence the activation of the analytics content fails.

 

Resolution -

 

The user should perform the steps below:

 

  1. Go to the Application and User Management work center.
  2. Select the business user.
  3. Click on Edit, then Access Rights.
  4. In the Work Center and View Assignment tab, click on the Find button.
  5. Search for the Business Analytics work center.
  6. Select the check box.
  7. Click Save and Close.

 

After performing the above steps, the activation of the solution will no longer fail with this error message.

 

Regards

Anant

Develop Better: New ISV partner relevant webinar about Introduction to Web Services


I would like to make you aware of the next webinar within our series of Develop Better with SAP Experts webinars for ISV partners. On the 29th of August and the 1st of September we offer a new webinar related to "Introduction to Web Services".

 

This webinar is related to the following detailed topics:

  • Introduction to Web services
  • Standard Web services
    • Where can I get help? (explain the documentation)
    • Read a specific customer
    • Creating a new customer
    • Updating an existing customer
    • Querying the entire customer database
  • OData Web Services
    • Create an OData service in the OData Service Explorer
  • Consuming External web services via PDI
  • Creating web services for custom objects via PDI

 

If you are interested in joining one of the sessions or if you would like to share this information with other relevant stakeholders, here are the related registration links:

 

https://partneredge.sap.com/en/library/education/products/entManage/byd/implement/e_oe_te_sbbbd_20975.html

https://partneredge.sap.com/en/library/education/products/entManage/byd/implement/e_oe_te_sbbbd_20976.html

 

In relation to these webinars I would also like to make you aware of the existing summary recording link for previous Develop Better webinars:


https://partneredge.sap.com/en/library/education/products/entManage/byd/te_ep_byd_app_dev_series.html

 

 

Recreate or Copy Screens from XML


Hi Everyone,

 

Sometimes we may face a situation where we need to recreate a screen from one solution in another solution, whereas porting or a solution template upload replaces the entire solution. Each field in the UI Designer already has corresponding XML tags generated, which are used to interact with the SAP servers, so it is possible to regenerate the UI from the XML. The content of each UI is distinguished by namespaces, unique IDs, PTPs, etc.; as long as we can tell source and destination apart, it is easy to manipulate a screen via its XML.

 

This blog shows how to copy a screen from one solution to another without porting.

 

Let's begin with the implementation part.

 

Here are my source solution and destination solution.


[Screenshots: source and destination solution structures]

 

 

I've created the business object with the same name (FabricOrderDesigns) in the destination, and I have created the screen in the destination solution.


[Screenshot: creating the screen]

 

 

Before copying from the source, my screen looks as below.

 

[Screenshot: OWL before the copy]

 

   1. Click the screen and press Enter; the XML mode of the UI component will appear.

[Screenshot: destination UI in XML mode]

 

   2. Now open the Find and Replace (Ctrl + F) window and search for "namespace"; you will get the URL of the solution. This URL can be seen under many tags and values, such as designtimeMainBOEsrNamespace, EsrNamespace, etc. Copy the namespace and store it somewhere temporarily.

 

[Screenshot: copying the destination namespace]

 

   3. The next step is to copy the path of the PTPs. Since the PTP differs between source and destination, we have to provide the proper path.

 

[Screenshot: copying the port type package path]

 

   4. This step is not mandatory: if you need the Export to Excel option you can do this, or you can simply delete the Event Handler tag to avoid errors.


[Screenshot: copying the Excel export handler]

 

   5. Once you have copied these values from the destination UI's XML, repeat the same for the source UI's XML and store them temporarily, like below.

[Screenshot: values stored temporarily]

 

   6. Now copy everything from the source UI and paste it in the destination UI.

 

[Screenshots: copying from the source UI and pasting into the destination UI]

 

 

   7. Now we have to replace (Ctrl + F, quick replace) the namespaces, PTPs and the Excel handler (not mandatory) in the destination UI, as below.

 

[Screenshots: replacing namespaces, PTPs and the Excel handler]

 

 

   8. Once you are done replacing, close the XML mode of the UI and save.

 

[Screenshot: saving the document]

 

   9. Since changes in the XML mode temporarily lock the UI, you have to delete the session to proceed further.

 

[Screenshot: deleting the session]

 

   10. Now open the OWL in the UI Designer (right-click and Open in UI Designer, or simply double-click). You can see that the UI is copied as in the source; now just save and activate the UI. It will work in most cases, unless the UI refers to other objects.

 

[Screenshot: the copied OWL]

 

    11. For QA, OIF, GAF, etc. we may use our custom OVS; for that we have to copy the location of the OVS and paste it in the destination UI. The rest of the steps are the same (copy and replace namespaces, PTPs, etc.).

We can get the path from the Properties tab.

 

[Screenshots: OVS properties, copying the path, searching for the OVS]

 

That's all about the screen copy.

 

Some Thoughts:

  1. I keep the BO names the same, since different BO names lead to a lot of changes in the XML.
  2. Before copying a screen, it is necessary to create the corresponding queries, BC sets, OVS, etc.
  3. The UIs of the PTP are usually the same for both source and destination; if not, we should copy and paste them too.
  4. The XML copy is possible only for custom UIs, not standard ones.
  5. Deleting the session is mandatory in step 9; if the session is locked, the UI Designer will not open for editing.
  6. If any component in the UI refers to an unknown component, it is difficult to activate as in step 10; in that case we have to fix the component in the XML or the UI Designer, or delete that component and recreate it.

 

Regards,

Senthil
