Antony Reynolds

Oracle Blogs

Customizing OIC Integrations

Tue, 2020-08-18 01:23

Often when creating an integration we have a number of constant values used in maps, expression editors and conditional expressions. Although they may be constants, sometimes they need to be changed. For example we may have an email list that is passed in to a notification activity. As we promote the code between environments we may need to change this "constant" email address.

Introducing Integration Properties

OIC has a new feature in the August 2020 release called Integration Properties. It allows a developer to define a property that is scoped at the integration level and can be updated without having to edit the integration itself. Read more in the Oracle Integration blog entry Integration Properties.

Using Lookups as Global Properties

Integration Properties make it very easy to customize a single integration but sometimes we need to use the same property in multiple integrations. If we use Integration Properties for this we need to edit the property in each integration when it changes. An alternative approach when we have a property that is used in multiple integrations is to store the values in a lookup table.

Properties in a Lookup Table

We store the properties in a lookup with two domains as shown in the picture.

One domain is used to hold the property names and the other domain is to hold the associated value. Within an integration we could use the lookup function to retrieve property values, but that requires going through the lookup wizard each time we want to use the property.

Property Retriever Integration

To simplify accessing the properties we can create an integration that returns a json document that includes all the properties from the lookup table. A sample json is shown below:

{
  "ErrorEmailList" : "sample@mail.com",
  "BaseFileDirectory" : "/some/directory"
}

In the integration we create a map to set the json values using the lookup function. This way we implement a single lookup call per property, so it is the same amount of work whether we use the property once in a single integration or ten times across a hundred different integrations.
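Outside OIC, the retriever's job is easy to sketch: flatten the (name, value) rows of the two-domain lookup into a single JSON object. A minimal Python illustration (the row values are made-up sample data):

```python
# Sketch of what the property retriever produces: one JSON object whose
# members are the (name, value) rows of the two-domain lookup table.
# The rows below are illustrative sample data, not real addresses or paths.
import json

def build_properties(rows):
    """Flatten lookup rows of (PropertyName, PropertyValue) into one JSON document."""
    return json.dumps({name: value for name, value in rows}, indent=2)

rows = [
    ("ErrorEmailList", "sample@mail.com"),
    ("BaseFileDirectory", "/some/directory"),
]
doc = json.loads(build_properties(rows))
```

Consuming integrations can then address properties by name instead of going through the lookup wizard each time.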

The map which uses the lookupValue function is shown below:

Consuming Properties

In integrations that need to use the global properties, the first activity in the integration should be a local invoke to call the property retriever integration. The properties will then be available in all maps as shown in the picture below:

When to Use Integration Properties and Lookups

So when should Integration Properties be used and when should lookups be used?

Integration Properties should be used when the properties are only needed in a single integration. If the same properties are required in multiple integrations then lookups should be used.

Summary

Integration Properties provide an excellent way to customize a single integration while lookups can be used in conjunction with a wrapper integration to provide a mechanism for properties to be shared across multiple integrations.

Here is a sample property retriever integration.

Connecting to IDCS from Oracle Integration

Mon, 2020-07-27 12:35

Oracle IDCS is the core identity management system for many Oracle PaaS services and can also be used with Fusion Applications. As such it is often useful to be able to manipulate identity information from within OIC. In this post I will show how to connect to IDCS from within OIC. One use case for this was the training that Oracle Integration PMs deliver: we create unique user ids for each training, and also need to unlock user accounts during training. By creating integrations to do this we were able to automate the provisioning of users, and a simple VBCS app could be used to unlock user accounts.

Overview

There are three basic steps involved in interfacing with IDCS from OIC.

  1. Create an IDCS Application that will be used to provide OAuth client credentials to the integration.
  2. Create an OIC Connection to IDCS using these credentials.
  3. Create the integration using the Connection to do what you want with IDCS.

Skip to the bottom of this blog for a summary of the settings required for each of the above activities or read through for details of how to configure it step by step.

Creating an IDCS Application

To create an IDCS application we follow the documentation instructions.

Login to IDCS and go to the IDCS dashboard. From there we can click on the plus icon in the top right corner of the applications panel.

There are a number of different types of application. We need to create a "Confidential Application" from the IDCS dashboard.

We give our application a name and then move on to the next screen.

We can then choose to "Configure this application as a client now", select grant type "Client Credentials" and add a role to allow administration of IDCS. Note when choosing a role be sure to follow the principle of least privilege. The "IDCS Domain Administrator" shown in the screenshot is usually more powerful than what you need, so choose a role with lower privilege levels. The AppRole Permissions page in the IDCS documentation will help you choose a role with least privilege for your required operations.

There are no resources that need to be protected, so we can skip the "Expose APIs to Other Applications" screen. We also don't need to enforce grants as authorization, so we can "Finish" adding our IDCS application. We will be rewarded with a popup screen showing the client ID and client secret required to request an OAuth access token.

After activating the application we are now ready to start using it. We can use Postman to verify that we correctly configured the IDCS application.
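The Postman check is a standard OAuth client-credentials call, which we can also sketch in code. Nothing is sent in this Python sketch; it only assembles the pieces of the token request so the shape is clear. The IDCS host, client id and client secret below are placeholders:

```python
# Sketch of the OAuth client-credentials token request used to verify the
# IDCS application. Nothing is sent; we only assemble the request pieces.
# The IDCS host, client id and client secret are placeholders.
import base64
from urllib.parse import urlencode

def token_request(idcs_url, client_id, client_secret):
    """Build the pieces of a client-credentials token request for IDCS."""
    creds = base64.b64encode(f"{client_id}:{client_secret}".encode()).decode()
    return {
        "url": f"{idcs_url}/oauth2/v1/token",
        "headers": {
            "Authorization": f"Basic {creds}",
            "Content-Type": "application/x-www-form-urlencoded",
        },
        "body": urlencode({
            "grant_type": "client_credentials",
            "scope": "urn:opc:idm:__myscopes__",
        }),
    }

req = token_request("https://idcs-example.identity.oraclecloud.com",
                    "my-client-id", "my-client-secret")
```

These are the same URL, credentials and scope values the OIC connection will use in the next section.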

Create an OIC Connection

We will be using the IDCS REST API so we need to create a REST connection in OIC.

We will call it "IDCS REST" and mark it as an invoke role only as we will not be implementing the IDCS API ourselves, just calling it.

We now provide the endpoint configuration. We set the following fields:

  • Connection Type to be "REST API Base URL"
  • Connection URL to be our IDCS URL (https://idcs-?????????.identity.oraclecloud.com)
  • Security Policy to be "OAuth Client Credentials"
  • Access Token URI should be based on our IDCS URL with a path of "oauth2/v1/token" (https://idcs-?????????.identity.oraclecloud.com/oauth2/v1/token)
  • Client Id as obtained from our IDCS application.
  • Client Secret as obtained from our IDCS application.
  • Scope should be "urn:opc:idm:__myscopes__"

We can now test our connection to verify that it works.

Creating an Integration to Query IDCS

We will create an integration to return all the users in an IDCS instance. We will begin by creating an Application Driven Orchestration.

We will call it "List IDCS Users" and put it in a package called "idcs.sample".

I used the "Sample REST Trigger" that ships with OIC as the trigger to my integration.

I called the trigger "GetUsers" and left the multiple resource/verbs blank as I am only going to implement a single resource/verb in this sample.

I set the URL to be "/users" but I don't need any headers or parameters. I do however want a response.

I will return the following fields in my response:

  • DisplayName
  • Username
  • Lock Status
  • Active Status

This translates to the following json:

[
  {
    "Username" : "TheUser1",
    "DisplayName" : "Friendly User 1",
    "Active" : true,
    "Locked" : false
  },
  {
    "Username" : "TheUser2",
    "DisplayName" : "Friendly User 2",
    "Active" : true,
    "Locked" : false
  }
]

The response type is automatically marked as json.

After verifying the configuration of the trigger I am ready to add an invoke.

We will create an invoke using the connection we created earlier.

We will call it "GetUsers" and configure it to call "/admin/v1/Users" and get a response.

We can get a response sample from the IDCS REST API documentation.

Other settings for the response are automatically completed for us.

We can then review the summary to ensure everything is as we expect.

With the trigger and invoke configured we can delete the map to the invoke as we are not parameterizing the query.

We can now perform the response mapping.
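In plain code, the response mapping does something like the following. IDCS returns a SCIM-style document with a "Resources" array; the field names used here ("userName", "displayName", "active") follow SCIM conventions, and the lock flag in particular is a simplification, so check the response sample from the IDCS REST API documentation for the exact shape.

```python
# Sketch of the response mapping: SCIM-style user records in, the simplified
# four-field records of our trigger response out. Field names are assumptions
# based on SCIM conventions; verify against the real IDCS response sample.
def map_users(idcs_response):
    return [
        {
            "Username": u.get("userName"),
            "DisplayName": u.get("displayName"),
            "Active": u.get("active", False),
            "Locked": u.get("locked", False),
        }
        for u in idcs_response.get("Resources", [])
    ]

sample = {"Resources": [
    {"userName": "TheUser1", "displayName": "Friendly User 1", "active": True},
]}
users = map_users(sample)
```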

We set up tracking using the "execute" input as a tracking value as there is no input to this integration.

We can now activate and test our integration.

The output from the test is shown below:

Summary

In this blog post we saw how to create an integration to interact with IDCS. Key points are:

  • Create IDCS Application
    • "Confidential Application" Type
    • Grant Type "Client Credentials"
    • Remember Client ID and Client Secret
  • Create IDCS Connection in OIC
    • REST Adapter
    • Connection Type "REST API Base URL"
    • Connection URL "https://idcs-?????????.identity.oraclecloud.com"
    • Security Policy "OAuth Client Credentials"
    • Access Token URI "https://idcs-?????????.identity.oraclecloud.com/oauth2/v1/token"
    • Client Id from IDCS application.
    • Client Secret from IDCS application.
    • Scope "urn:opc:idm:__myscopes__"
  • Create OIC integration
    • Invoke Path "/admin/v1/Users" for listing users

We showed listing users, but we could have created or modified settings in IDCS as well, depending on the permissions granted in the IDCS application.

Integration Cloud Who Are We - Using Headers to Identify IP Addresses

Sat, 2018-12-01 23:55
Identifying Integration Cloud IP Addresses

In this blog post I will show how to identify IP addresses associated with Integration Cloud.

The Challenge

When whitelisting services we need to know their IP address.  This is easy for the inbound address to Integration Cloud: we can just resolve the Integration Cloud hostname to get the IP address of the Integration Cloud load balancer, which is what needs to be whitelisted for inbound traffic.  This is useful if we need to whitelist outbound calls through a firewall to Integration Cloud.  However for outbound traffic from Integration Cloud we need to do a little more.  This is needed when the targets of Integration Cloud invokes need to whitelist their caller.

The Solution Part #1 Using an External Service to Look Up Source IP

We will create an integration that calls a service that returns the calling IP address.  For outbound traffic from Integration Cloud this will be different from the inbound address.  There is a service called Ipify that returns the calling IP address.  It has a REST API that can be invoked to obtain the IP address.

To create it we create a new integration called Get Outbound IP.

The integration provides a simple REST interface to retrieve the outbound IP address.

REST Interface: Resource /outboundip : Method GET

Response Media Type: application/json

Response sample: {"ip":"129.157.69.37"}

We call the Ipify REST API as an invoke so that we can get the outbound IP address of Integration Cloud.

The IPIFY Connection has the following properties:

Connection Type: REST API Base URL

Connection URL: https://api.ipify.org

Security Policy: No Security Policy

The invoke has the following properties:

REST Service URL: /

Method: GET

Query Params: format

JSON Response Sample: {"ip":"129.157.69.37"}

The overall flow looks as below:

Calling this integration returns the IP address used by Integration Cloud for outbound calls.  This can then be used to whitelist calls to an FTP server, for example.
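The mapping step of this flow is trivial but worth seeing in code: the Ipify invoke returns a small JSON body, and the response map copies it onto our own /outboundip response. The network call is stubbed in this Python sketch so only the mapping is shown:

```python
# Sketch of the Ipify leg of the flow: the invoke calls https://api.ipify.org
# with format=json, and the map copies the {"ip": ...} reply onto our own
# /outboundip response. The network call is stubbed here.
import json

def map_outbound_ip(ipify_body):
    """Map the Ipify JSON reply onto the integration's response."""
    return {"ip": json.loads(ipify_body)["ip"]}

# In the real integration this body comes from the IPIFY invoke response.
resp = map_outbound_ip('{"ip":"129.157.69.37"}')
```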

The Solution Part #2 Implementing the Equivalent of Ipify

In Part #1 we used an external service to get the IP address of the caller of the service.  In this section we will implement an equivalent service in Integration Cloud.

In order to obtain the IP address of the client of Integration Cloud we need to access the headers set by the load balancer.  The Load Balancer as a Service used by Integration Cloud defines a special header X-Real-IP to provide target systems with the actual IP address of the calling system.

When defining the REST endpoint of our integration we need to define a custom inbound header.  This is done on the trigger.

Note we declare the use of custom headers.

We provide the basic endpoint information.

Then we are prompted to provide the name of custom header fields - X-Real-IP.

This is all that is needed to obtain the IP address of the caller of the integration.  The whole flow does not require anything other than mapping the header field back onto the response JSON as shown in the flow below.
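The whole flow can be sketched in a few lines: read the X-Real-IP header injected by the load balancer and echo it back as JSON. In this Python sketch the headers dict stands in for the custom-header input the trigger exposes to the map:

```python
# Sketch of the whole flow: read the X-Real-IP header that the load balancer
# injected and echo it back as {"ip": ...}. The headers dict stands in for
# the custom-header input exposed by the trigger.
import json

def whoami(headers):
    """Return the caller's IP, taken from the X-Real-IP custom header."""
    return json.dumps({"ip": headers.get("X-Real-IP", "unknown")})

body = json.loads(whoami({"X-Real-IP": "203.0.113.7"}))
```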

Summary

This post has shown how using custom headers we can determine the IP address of the caller of an integration, providing the same functionality as the Ipify service.

The IP address of the caller may be useful for logging or other purposes, so being able to access it via the custom headers is a valuable tool in our arsenal.

Verifying White Listing for Oracle Integration Platform

Sun, 2018-09-23 18:33
Verifying Your White List is Working

A lot of customers require all outbound connections from systems to be validated against a whitelist.  This article explains the different types of whitelist that might be applied and why they are important to Oracle Integration Cloud (OIC).  Whitelisting means that if a system is not specifically enabled then its internet access is blocked.

The Need

If your company requires systems to be whitelisted then you need to consider the following use cases:

  • Agent Requires Access to Integration Cloud
  • On-Premise Systems Initiating Integration Flows
  • On-Premise Systems Raising Events

In all the above cases we need to be able to make a call to Integration Cloud through the firewall which may require whitelisting.

Types of Whitelisting

Typically there are two components involved in whitelisting: the source system and the target system.  In our case the target system will be Oracle Integration Cloud, and if using OAuth then the Identity Cloud Service (IDCS) as well.  The source system will be either the OIC connectivity agent, or a source system initiating integration flows, possibly via an event mechanism.

Whitelisting Patterns

  Pattern           Source Whitelisted   Target Whitelisted
  Target Only       No                   Yes
  Source & Target   Yes                  Yes
  Source Only       Yes                  No

Only the first two are usually seen, the third is included for completeness but I have not seen it in the wild.

Information Required

When providing information to the network group to enable the whitelisting you may be asked to provide IP addresses of the systems being used.  You can obtain these by using the nslookup command.

> nslookup myenv-mytenancy.integration.ocp.oraclecloud.com
Server:   123.45.12.34
Address:  123.45.12.34#53

Non-authoritative answer:
myenv-mytenancy.integration.ocp.oraclecloud.com canonical name = 123456789ABCDEF.integration.ocp.oraclecloud.com.
Name:     123456789ABCDEF.integration.ocp.oraclecloud.com
Address:  123.123.123.123

You will certainly need to look up your OIC instance hostname.  You may also need your IDCS instance hostname, which is the URL you see when logging on.

Testing Access

Once the whitelist is enabled we can test it by using the curl command from the machine from which we require whitelist access.

> curl -i -u 'my_user@mycompany.com:MyP@ssw0rd' https://myenv-mytenancy.integration.ocp.oraclecloud.com/icsapis/v2/integrations
HTTP/1.1 200 OK
Date: Sun, 23 Sep 2018 23:19:44 GMT
Content-Type: application/json;charset=utf-8
Transfer-Encoding: chunked
Connection: keep-alive
X-ORACLE-DMS-ECID: 1234567890abcdef
X-ORACLE-DMS-RID: 0
Set-Cookie: iscs_auth=0123456789abcdef; path=/; HttpOnly
...

The -i flag is used to show the headers of the response; if there is an error this flag will enable you to see the HTTP error code.

The -u flag is used to provide credentials.

In the example above we have listed all the integrations that are in the instance.  If you don't see the list of integrations then something is wrong.  Common problems are:

  • Wrong URL
  • Wrong username/password - pass them using single quotes to prevent interpretation of special characters by the shell.
  • Access denied due to whitelist not enabled - depending on the environment this may show as a timeout or an error from a proxy server.

Summary

As you can see, gathering the information for whitelisting and then testing that it is correctly enabled are straightforward tasks that don't require advanced networking skills.

Mapping Key Value Pairs onto JSON Objects in Oracle Integration Cloud

Wed, 2018-08-29 23:24

We were recently doing some work on a system, actually a Robotic Process Automation (RPA) endpoint, that generated unique JSON messages for each type of request.  So a single interface would expect different objects, depending on the request.  The target system actually required a small orchestration to submit a single request and so ideally a single integration would abstract the interface to that service.

To make this concrete, here is an example:

Request One Create Order

This is the request to be sent to the generic service

{
  "session" : "ABC123",
  "operation" : "createOrder",
  "data" : {
    "Customer" : "Antony",
    "Item" : "Stuffed Spinach Pizza"
  }
}

Note that the data has named fields.

Request Two Get Order

This is another request to be sent to the same service, but a different operation results in a different payload.

{
  "session" : "ABC123",
  "operation" : "getOrder",
  "data" : {
    "OrderID" : "112358",
    "FetchAllFields" : "True"
  }
}

Note that the operation has changed and as a result the named fields in data are now different.

So even though the endpoint is the same, and the call preparation and tear down are the same, it appears we will have to have a unique integration for each type of request.  That sounds like a lot of duplicate work :-(

The Problem

We want to have a generic interface to our target, but because it takes different data formats for different operations that means custom coding for each operation.  Within OIC we need to define the data shape so that when we map data we know what fields we are mapping.

Generic Interface Solution

Ideally we would like a single integration with a generic interface, something like the one below:

{
  "session" : "ABC123",
  "operation" : "createOrder",
  "dictionary" : [
    {
      "key" : "Customer",
      "value" : "Antony"
    },
    {
      "key" : "Item",
      "value" : "Stuffed Spinach Pizza"
    }
  ]
}

Note the use of key value pairs that allow us to pass arbitrary data into the integration.  The only problem is how to map the data.  We need to create entries in the target data object corresponding to the keys in the source dictionary, and then set those entries to the corresponding values from the dictionary.
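What we want the mapping to do is easy to state in plain code: each {"key", "value"} pair becomes a named member of the "data" object. A Python sketch of the desired transformation:

```python
# The transformation we need the map to perform: key/value pairs in the
# generic request become named fields of the target "data" object.
def to_data_object(request):
    return {
        "session": request["session"],
        "operation": request["operation"],
        "data": {e["key"]: e["value"] for e in request["dictionary"]},
    }

req = {
    "session": "ABC123",
    "operation": "createOrder",
    "dictionary": [
        {"key": "Customer", "value": "Antony"},
        {"key": "Item", "value": "Stuffed Spinach Pizza"},
    ],
}
out = to_data_object(req)
```

The challenge is persuading OIC's XSL-based maps to do this when the keys are not known at design time.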

Partial Solution

If we knew the target fields then we could use array indexing to select the correct value corresponding to the name as shown below:

Note that this is using the new JET based mapper which will shortly be available in Oracle Integration Cloud, it is currently in controlled availability.

Here we select the dictionary item whose "key" is Customer and put its value into the Customer field.  This doesn't work if we don't know the field names, which is the case for us!

This works if we know the target names when we build the integration; using the map above we can transform data as shown below.

Source:

{
  "session" : "ABC123",
  "operation" : "createOrder",
  "dictionary" : [
    {
      "key" : "Customer",
      "value" : "Antony"
    },
    {
      "key" : "Item",
      "value" : "Stuffed Spinach Pizza"
    }
  ]
}

Target:

{
    "session": "ABC123",
    "operation": "createOrder",
    "data": {
        "Customer": "Antony"
    }
}

Unfortunately we don't always know the names ahead of time in which case this solution doesn't work.

A Generic Solution

So let's look at a more generic solution.  There is an XSL <element> tag that can be used to create an arbitrary element.  Unfortunately we have to use this in a hand-crafted XSL, as the mapper does not support the <element> tag - yet.

The process is:

  1. Map as much as you can using the mapper, including a single array lookup similar to the one above.
  2. Export your integration
  3. Unzip the integration
  4. Find your XSL map in the unzipped package (it will be in the folder icspackage/project/YOUR_INTEGRATION_NAME_VERSION/resources/processor_XX/resourcegroup_YY where XX and YY are arbitrary numbers)
  5. Edit the XSL map replacing the array mapping with the following:
    • Create a for loop over the base array element, the dictionary in our example
    • Create an element in the for-loop with name from key element and value from value element
      • <xsl:element name="{nsmpr0:key}"><xsl:value-of select="nsmpr0:value"/></xsl:element>
  6. Import the XSL into the integration

The mapping before and after looks like this:

Before:

<nsmpr0:response-wrapper xml:id="id_16">
   <nsmpr0:session xml:id="id_17">
      <xsl:value-of select="/nstrgmpr:execute/nsmpr0:request-wrapper/nsmpr0:session" xml:id="id_18"/>
   </nsmpr0:session>
   <nsmpr0:operation xml:id="id_19">
      <xsl:value-of select="/nstrgmpr:execute/nsmpr0:request-wrapper/nsmpr0:operation" xml:id="id_20"/>
   </nsmpr0:operation>
   <nsmpr0:data xml:id="id_24">
      <nsmpr0:Customer xml:id="id_25">
         <xsl:value-of xml:id="id_26" select="/nstrgmpr:execute/nsmpr0:request-wrapper/nsmpr0:dictionary[nsmpr0:key = &quot;Customer&quot;]/nsmpr0:value"/>
      </nsmpr0:Customer>
   </nsmpr0:data>
</nsmpr0:response-wrapper>

After:

<nsmpr0:response-wrapper xml:id="id_16">
   <nsmpr0:session xml:id="id_17">
      <xsl:value-of select="/nstrgmpr:execute/nsmpr0:request-wrapper/nsmpr0:session" xml:id="id_18"/>
   </nsmpr0:session>
   <nsmpr0:operation xml:id="id_19">
      <xsl:value-of select="/nstrgmpr:execute/nsmpr0:request-wrapper/nsmpr0:operation" xml:id="id_20"/>
   </nsmpr0:operation>
   <nsmpr0:data xml:id="id_24">
      <xsl:for-each select="/nstrgmpr:execute/nsmpr0:request-wrapper/nsmpr0:dictionary">
         <xsl:element name="{nsmpr0:key}">
            <xsl:value-of select="nsmpr0:value"/>
         </xsl:element>
      </xsl:for-each>
   </nsmpr0:data>
</nsmpr0:response-wrapper>

Note that once imported into the integration you cannot edit it using the mapper in OIC.
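For comparison, the same dynamic-element trick exists in most XML libraries. Here is the idea in Python's standard library, where, as with <xsl:element>, the element names come from the data rather than from the map:

```python
# Dynamic element creation: element names are taken from the "key" values at
# runtime, mirroring what <xsl:element name="{nsmpr0:key}"> does in the map.
# Namespaces are ignored here, which is fine for JSON-bound payloads.
import xml.etree.ElementTree as ET

def build_data_element(dictionary):
    data = ET.Element("data")
    for entry in dictionary:
        child = ET.SubElement(data, entry["key"])  # name comes from the data
        child.text = entry["value"]
    return data

element = build_data_element([
    {"key": "Customer", "value": "Antony"},
    {"key": "Item", "value": "Stuffed Spinach Pizza"},
])
xml_text = ET.tostring(element, encoding="unicode")
```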

JSON Note

The above example works fine because it is generating JSON, and the XML/REST conversion in OIC does not pay attention to namespaces because there is no such construct in JSON.  If we wanted to do the same with XML output then we would need to be more careful with namespaces, although the element tag does support namespace specification.

Summary

We can deal with JSON data types that are unknown at design time by using key/value pairs to dynamically construct the correct JSON objects.  This can be done in OIC and allows us to create generic integration wrappers for services that dynamically generate data types.

Reflecting Changes in Business Objects in UI Tables with Visual Builder

Mon, 2018-05-21 13:14

While the quick start wizards in Visual Builder Cloud Service (VBCS) make it very easy to create tables and other UI components and bind them to business objects, it is good to understand what is going on behind the scenes, and what the wizards actually do. Knowing this will help you achieve things that we still don't have wizards for.

For example - let's suppose you created a business object and then created a UI table that shows the fields from that business object in your page. You probably used the "Add Data" quick start wizard to do that. But then you remembered that you need one more column added to your business object; however, after you add it to the BO, you'll notice it is not automatically shown in the UI. That makes sense, since we don't want to automatically show all the fields in a BO in the UI.

But how do you add this new column to the UI?

The table's Add Data wizard will be disabled at this point - so is your only option to drop and recreate the UI table? Of course not!


If you look into the table properties you'll see it is based on a page-level ServiceDataProvider (SDP for short) variable. This is a special type of object that the wizards create to represent collections. If you look at the variable, you'll see that it returns data using a specific type. Note that the type is defined at the flow level - if you look at the type definition you'll see where the fields that make up the object are defined.

Type Definition

It is very easy to add a new field here - and modify the type to include the new column you added to the BO. Just make sure you are using the column's id - and not its title - when you define the new field in the items array.

Now back in the UI you can easily modify the code of the table to add one more column that will be hooked up to this new field in the SDP that is based on the type.

Sounds complex? It really isn't - here is a 3 minute video showing the whole thing end to end:

As you can see, a little understanding of the way VBCS works makes it easy to go beyond the wizards and achieve things we don't yet have wizards for.

European Privacy Requirements: Considerations for Retailers

Mon, 2018-05-21 11:52

When retailers throughout Europe adopt a new set of privacy and security regulations this week, it will be the first major revision of data protection guidelines in more than 20 years. The 2018 regulations address personal as well as financial data, and require that retailers use systems already designed to fulfill these protections by default.

In 1995, the European Commission adopted a Data Protection Directive that regulates the processing of personal data within the European Union. This gave rise to 27 different national data regulations, all of which remain intact today. In 2012, the EC announced that it would supersede these national regulations and unify data protection law across the EU by adopting a new set of requirements called the General Data Protection Regulation (GDPR).

The rules apply to any retailer selling to European consumers. The GDPR, which takes effect May 25, 2018, pertains to any company doing business in, or with citizens of, the European Union, and to both new and existing products and services. Organizations found to be in violation of the GDPR will face a steep penalty of 20 million euros or four percent of their gross annual revenue, whichever is greater.

Retailers Must Protect Consumers While Personalizing Offers

GDPR regulations will encompass personal as well as financial data, including much of the data found in a robust customer engagement system, CRM, or loyalty program. It also includes information not historically considered to be personal data: device IDs, IP addresses, log data, geolocation data, and, very likely, cookies.

For the majority of retailers relying on customer data to personalize offers, it is critically important to understand how to fulfill GDPR requirements and execute core retail, customer, and marketing operations. Developing an intimate relationship with consumers and delivering personalized offers means tapping into myriad data sources.

This can be done, but systems must be GDPR-compliant by design and by default. A key concept underlying the GDPR is Privacy by Design (PBD), which essentially stipulates that systems be designed to minimize the amount of personal data they collect. Beginning this week, Privacy by Design features become a regulatory requirement for both Oracle and our customers, and the GDPR stipulates that these protections are turned on by default.

Implementing Security Control Features

While the GDPR requires “appropriate security and confidentiality,” exact security controls are not specified. However, a number of security control features are discussed in the text and will likely be required for certain types of data or processing. Among them are multi-factor authentication for cloud services, customer-configurable IP whitelisting, granular access controls (by record, data element, data type, or logs), encryption, anonymization, and tokenization.

Other security controls likely to be required are “separation of duties” (a customer option requiring two people to perform certain administrative tasks); customer options for marking some fields as sensitive and restricted; limited access on the part of the data controller (i.e. Oracle) to customer information; displaying only a portion of a data field; and the permanent removal of portions of a data element.

Summary of Critical GDPR Requirements

The GDPR includes a number of recommendations and requirements governing users’ overall approach to data gathering and use. Among the more important are:

  • Minimization. Users are required to minimize the amount of data used, length of time it is stored, the number of people who have access to it, and the extent of that access.
  • Retention and purging. Data may be retained for only as long as reasonably necessary. This applies in particular to personal data, which should be processed only if the purpose of processing cannot reasonably be fulfilled by other means. Services must delete customer data on completion of the services.
  • Exports and portability. End users must be provided with copies of their data in a structured, commonly used digital format. Customers will be required to allow end users to send data directly to a competing service provider for some services.
  • Access, correction, and deletion. End users may request access to, correction of, and deletion of the data they store in any service. Users may have a “right to be forgotten”: a right to have all their data erased.
  • Notice and consent. When information is collected, end-user notice and consent for data processing is generally required.
  • Backup and disaster recovery. Timely availability of end-user data must be ensured.

Are you prepared?

Oracle is prepared for the EU General Data Protection Regulation (GDPR) that was adopted by the European Parliament in April 2016 and will become effective on May 25, 2018. We welcome the positive changes it is expected to bring to our service offerings by providing a consistent and unified data protection regime for businesses across Europe. Oracle is committed to helping its customers address the GDPR’s new requirements that are relevant to our service offerings, including any applicable processor accountability requirements.

Our customers can rest assured that Oracle Retail’s omnichannel suite will empower them to continue delivering personalized customer experiences that meet complex global data privacy regulations. Contact Oracle Retail to learn more about Oracle systems, services and GDPR compliance: oneretailvoice_ww@oracle.com

New Oracle E-Business Suite Person Data Removal Tool Now Available

Mon, 2018-05-21 10:27

Oracle is pleased to announce the availability of the Oracle E-Business Suite Person Data Removal Tool, designed to remove (obfuscate) data associated with people in E-Business Suite systems. Customers can apply the tool to select information in their E-Business Suite production systems to help address internal operational and external regulatory requirements, such as the EU General Data Protection Regulation (GDPR).

For more details, see:

DP World Extends Strategic Collaboration with Oracle to Accelerate Global Digital ...

Mon, 2018-05-21 09:56

Global trade enabler DP World has extended its partnership with Oracle to implement its digital transformation programme that supports its strategy to develop complementary sectors in the global supply chain such as industrial parks, free zones and logistics. 

 

Suhail Al Banna, Senior Vice President, DP World, Middle East and Africa Region; Arun Khehar, Senior Vice President – Business Applications, ECEMEA, Oracle; Mohammed Al Muallem, CEO and Managing Director, DP World, UAE Region and CEO, JAFZA.

Suhail Al Banna, Senior Vice President, DP World, Middle East and Africa Region; Arun Khehar, Senior Vice President – Business Applications, ECEMEA, Oracle; Mohammed Al Muallem, CEO and Managing Director, DP World, UAE Region and CEO, JAFZA.


The move follows an announcement by DP World earlier this year to use the Oracle Cloud Suite of Applications to drive business transformation. Oracle Consulting will now implement the full suite of Fusion Enterprise Resource Planning (ERP), Human Capital Management (HCM) and Enterprise Performance Management (EPM) Cloud solutions using its True Cloud methodology. The technology roll-out across the Group has already started, with the Group’s UAE Region and Middle East and Africa Region the first to sign up.

Teo Chin Seng, Senior Vice President IT, DP World Group, said: “Our focus on building our digital capability follows our vision to become a digitised global trade enabler, and we are working to achieve a new level of operational efficiency while creating value for our stakeholders.”

Arun Khehar, Senior Vice President – Business Applications, ECEMEA, Oracle, said: “Following the recent announcement of our strategic partnership to help DP World drive its global digital transformation with our best-in-class Cloud Suite of Applications (SaaS), we are proud to extend our collaboration by leveraging the deep expertise of Oracle Consulting to drive this large scale project. We are confident that this strategic cloud deployment will help them deliver the next level of innovation and differentiation.”

The Oracle Consulting team is focused exclusively on Oracle Cloud solutions and staffed with more than 7,000 experts in 175 countries serving more than 20 million users to help organizations implement Oracle Cloud in an efficient and cost-effective manner.


Further press releases: Oracle Middle East Newsroom

Experience, Not Conversion, is the Key to the Switching Economy

Mon, 2018-05-21 08:00

In a world increasingly defined by instant-gratification, the demand for positive and direct shopping experiences has risen exponentially. Today’s always-on customers are drawn to the most convenient products and services available. As a result, we are witnessing higher customer switching rates, with consumers focusing more on convenience than on branding, reputation, or even on price.  

In this switching economy – where information and services are always just a click away –  we tend to reach for what suits our needs in the shortest amount of time. This shift in decision making has made it harder than ever for businesses to build loyalty among their customers and to guarantee repeat purchases. According to recent research, only 1 in 5 consumers now consider it a hassle to switch between brands, while a third would rather shop for better deals than stay loyal to a single organization. 

What's Changed? 

The consumer mindset, for one. And the switching tools available to customers have also changed. Customers now have the ability to research extensively before they purchase, with access to reviews and price comparison sites often meaning that consumers don’t even make it to your website before being captured by a competitor.

This poses a serious concern for those brands that have devoted their time – and marketing budgets – to building great customer experiences across their websites. 

Clearly this is not to say that on-site experiences aren’t important, but rather that they are only one part of the wider customer journey. In an environment as complex and fast moving as the switching economy, you must look to take a more omnichannel approach to experience, examining how your websites, mobile apps, customer service teams, external reviews and in-store experiences are all shaping the customers’ perceptions of your brand. 

What Still Needs to Change?

Only by getting to know your customers across all of these different channels can you future-proof your brand in the switching economy. To achieve this, you must establish a new set of metrics that go beyond website conversion. The days of conversion optimization being viewed as the secret sauce for competitive differentiation are over; now brands must recognize that high conversion rates are not necessarily synonymous with a great customer experience – or lifetime loyalty. 

Today, the real measure of success does not come from conversion, but from building a true understanding of your customers – across every touchpoint in the omnichannel journey. Through the rise of experience analytics, you finally have the tools and technologies needed to understand customers in this way, and to tailor all aspects of your brand to maximize convenience, encourage positive mindsets and pre-empt when your customers are planning to switch to a different brand. 

It is only through this additional layer of insight that businesses and brands will rebuild the notion of customer loyalty, and ultimately, overcome the challenges of the switching economy. 

Want to learn more about simplifying and improving the customer experience? Read Customer Experience Simplified: Deliver The Experience Your Customers Want to discover how to provide customer experiences that are managed as carefully as the product, the price, and the promotion of the marketing mix.

Customer Experience Simplified

If You Are Struggling With GDPR, Then You Are Not Alone

Mon, 2018-05-21 08:00

Well, it's only 5 days to go until the infamous GDPR deadline of 25th May 2018 and you can certainly see the activity accelerating.

You would have thought that with the deadline so close, most organisations would be sitting back, relaxing, safe in the knowledge that they have had two years to prepare for GDPR and are therefore completely ready for it. It's true, some organisations are prepared and have spent the last 24 months working hard to meet the regulations. Sadly, there is also a significant proportion of companies who aren't quite ready. Some, because they have left it too late. Others, by choice.

Earlier this week I had the pleasure of being invited to sit on a panel discussing GDPR at Equinix's Innovation through Interconnection conference in London.

As with most panels, we had a very interesting discussion, talking about all aspects of GDPR including readiness, data sovereignty, healthcare, the role of Cloud, and the dreaded Brexit!

I have written before about GDPR, but this time I thought I would take a bit of time to summarise three of the more interesting discussion topics from the panel, particularly areas where I feel companies are struggling.

Are you including all of the right personal data?

There is a clear recognition that an organisation's customer data is in scope for GDPR. Indeed, my own personal email account has been inundated with opt-in consent emails from loads of companies, many of whom I had forgotten even had my data. Clearly, companies are making sure that they are addressing GDPR for their customers. However, I think there is a general concern that some organisations are missing some of the data, especially internal data such as that of their employees. HR data is just as important when it comes to GDPR, yet I see some companies paying far less attention to this area than to their customers' data.

Does Cloud help or hinder GDPR compliance?

A lot was discussed on the panel around the use of cloud. Personally, I think that cloud can be a great enabler, taking away some of the responsibility and overhead of implementing security controls, processes, and procedures and allowing the Data Processor (the Cloud Service Provider) to bring all of their experience, skill and resources into delivering you a secure environment. Of course, the use of Cloud also changes the dynamic. As the Data Controller, an organisation still has plenty of its own responsibility, including that of the data itself. Therefore, putting your systems and data into the Cloud doesn't allow you to wash your hands of the responsibility. However, it does allow you to focus on your smaller, more focused areas of responsibility. You can read more about shared responsibility from Oracle's CISO, Gail Coury, in this article. Of course, you need to make sure you pick the right cloud service provider to partner with. I'm sure I must have mentioned before that Oracle does Cloud and does it extremely well.

What are the real challenges customers are facing with GDPR?

I talk to lots of customers about GDPR and my observations were acknowledged during the panel discussion. Subject access rights are causing lots of headaches. To put it simply, I think we can break GDPR down into two main areas: Information Security and Subject Access Rights. Organisations have been implementing Information Security for many years (to varying degrees), especially if they have been subject to other regulations such as PCI, HIPAA and SOX. However, whilst the UK Data Protection Act has always had principles around data subjects, GDPR really brings them front and centre. Implementing many of the principles associated with data subjects, i.e. me and you, can mean changing applications, implementing new processes, identifying sources of data across an organisation, and so on. None of this is proving simple.

On a similar theme, responding to subject access rights due to this spread of data across an organisation is worrying many company service desks, concerned that come 25th May, they will be inundated with requests they cannot fulfil in a timely manner.

Oh and of course, that's before you even get to paper-based and unstructured data, which is proving to be a whole new level of challenge.

I could continue, but the above three areas are some of the main topics I am hearing over and over again from the customers I talk to. Hopefully, everyone has realised that there is no silver bullet for achieving GDPR compliance, and, for those companies who won't be ready in five days' time, I hope you at least have a strong plan in place.

See What Your Guests Think with Data Visualization

Mon, 2018-05-21 06:00

As we approach the end of May, thoughts of summer and vacations begin. Naturally, a key component is finding the best place to stay and often that means considering the hotel options at your chosen destination. But what’s the best way to decide? That’s where reading reviews is so important.   

And that brings us to the latest blog in the series of taking datasets from ‘less typical’ sources and analyzing them with Oracle Data Visualization. Here, we’ve pulled the reviews from Booking.com as a dataset and visualized it to see how we – the general public - rate the hotels we stay in.

Working with Ismail Syed, pre-sales intern, and Harry Snart, pre-sales consultant, both from Oracle UK, we ran the analysis and created visualizations. We decided to look at the most common words used in both positive and negative reviews, see how long each of them is – and work out which countries are the most discerning when they give their feedback. 

So, what are the main irritations when we go away? Conversely - what's making a good impression?

Words of discontent

First, we wanted to combine the most commonly used words in a positive review with those most likely used in a negative review. You can see these in the stacked bar chart below. Interestingly, 'room' and 'staff' both appear in the positive and negative comments list. However, there are far more positive reviews around staff than negative ones, and likewise a lot more negative reviews around the room than positive reviews.
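As an illustration, the word-frequency tally behind a chart like this can be sketched in a few lines of Python. The stop-word list and sample reviews below are invented for the example; the original analysis was done in Oracle Data Visualization against the full Booking.com dataset.

```python
from collections import Counter

# Toy stop-word list; a real analysis would use a fuller one.
STOP_WORDS = {"the", "a", "an", "and", "was", "were", "with", "of", "to"}

def top_words(reviews, n=5):
    """Tally the most frequent meaningful words across a list of review texts."""
    counts = Counter()
    for text in reviews:
        words = (w.strip(".,!?") for w in text.lower().split())
        counts.update(w for w in words if w.isalpha() and w not in STOP_WORDS)
    return counts.most_common(n)

# Invented sample reviews, standing in for the Booking.com dataset.
positive = [
    "The staff were friendly and the breakfast was great",
    "Great staff, lovely breakfast",
]
negative = [
    "The room was tiny and the breakfast was cold",
    "Small room, uncomfortable bed",
]

print(top_words(positive, 3))  # [('staff', 2), ('breakfast', 2), ('great', 2)]
print(top_words(negative, 2))
```

Even this toy sample shows the pattern described above: 'staff' dominates the positive tally, 'room' the negative one, and 'breakfast' appears in both.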

It seems then, across the board, guests find customer service better than the standard of the rooms they receive – implying that an effective way to boost client retention would be to start by improving rooms. In particular, guests complained about the small size of the rooms. That’s a tough fix, but people were more upset about the standard of the beds, bathrooms and toilets, which can be updated rather more easily.

You’ll also notice 'breakfast' appears prominently in both the positive and negative word clouds – so a more achievable fix could be to start there. A bad breakfast can leave a bad taste, but a good one is obviously remembered. 

Who’ll give a good review?

Next, we wanted to see who the most complimentary reviewers were, by nationality. While North Americans, Australians and Kyrgyz (highlighted in green) tend to leave the most favorable reviews, hotels have a harder time impressing those from Madagascar, Nepal and Mali (in red). Europeans sit somewhere in the middle – except for Bosnia and Herzegovina, who like to leave an upbeat review.   

Next, we wanted to see who is the most verbose in their feedback – the negative reviewers or the positive reviewers – and which countries leave the longest posts.

Are shorter reviews sweeter?

Overall, negative reviews were slightly longer, but only by a small amount – contrary to the popular belief that we tend to ‘rant’ more when we’re perturbed about something. People from Trinidad and Tobago left the longest good reviews, at an average of 29 words. Those from Belarus, the USA and Canada followed as the wordiest positive reviewers. On the flip side, Romanians, Swedes, Russians and Germans had a lot to say about their bad experiences – leaving an average of 22 words showing their displeasure.
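The length comparison boils down to a grouped average of word counts. A minimal sketch of that calculation follows; the rows and figures here are invented for illustration and are not the numbers from the actual dataset.

```python
from collections import defaultdict
from statistics import mean

# Hypothetical rows: (nationality, review text, is_positive)
reviews = [
    ("Trinidad and Tobago", "Wonderful pool area and very helpful concierge staff", True),
    ("Trinidad and Tobago", "Spotless room with a lovely view of the bay", True),
    ("Germany", "Noisy air conditioning and a long wait at check-in", False),
    ("Germany", "Breakfast ran out early", False),
]

def avg_words_by_country(rows, positive):
    """Average review length in words per nationality, for one sentiment."""
    lengths = defaultdict(list)
    for country, text, is_positive in rows:
        if is_positive == positive:
            lengths[country].append(len(text.split()))
    return {country: mean(counts) for country, counts in lengths.items()}

print(avg_words_by_country(reviews, positive=True))
print(avg_words_by_country(reviews, positive=False))
```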

It's business, but also personal...

Clearly, data visualization doesn't need to be a tool just for the workplace; you can deploy it to gain insight into other aspects of life as well – including helping you prepare for some valuable time off.

If you’re an IT leader in your organization and need to enable insights for everyone across the business, you should consider a complete, connected and collaborative analytics platform like Oracle Analytics Cloud. Why not find out a bit more and get started for free?

If you're simply interested in visual analysis of your own data, why not see what you can find out by taking a look at our short demo and signing up for an Oracle Data Visualization trial?

Either way, make sure you and your business take a vacation from spreadsheets and discover far more from your data through visualization.

HR today: right skills, right place, right time, right price

Mon, 2018-05-21 05:49

The only constant in today’s work environment is change. If you’re going to grow and stay competitive in this era of digital transformation, your business has to keep up—and HR must too.

A wide range of factors all mean that HR constantly has to grow and transform—changing demographics, new business models, economic uncertainty, evolving employee expectations, the bring-your-own-device revolution, increased automation, AI, the relentless search for cost savings, and more.

Things are different today. In the past, business change processes typically had a start and target end date, with specific deliverables that were defined in advance. Now change is open-ended, and its objectives evolve over time—based on the world as it is, rather than a set of assumptions. An agile model for transformation is therefore essential, along with a decision-making process that can survive constant change.

The fact is that people are still—and will always be—the most important part of any business, so HR has to be closely aligned to your overall business goals, delivering benefits to the whole organisation. Every move your HR team makes should be focused on how to deliver the right skills in the right place, at the right time and at the right price, to achieve your business’s goals.


Workforce planning

To manage your workforce effectively as the needs of your business change, you need to know what talent you have, where it’s located—and also what skills you are likely to need in the future. It’s much easier to fill skills gaps when you can see, or anticipate, them.


Deliver maximum value from your own people

And filling them is much easier if you’ve already nurtured a culture of personal improvement. Giving people new opportunities to learn and develop, and a sense of control over their own careers, will help you maintain up-to-date skills within your business and also identify the best candidates—whether for promotion, relocation within the company or to take on specific roles. Moreover, it should enable them to, for example, pursue areas of personal interest, train for qualifications, or perhaps work flexibly—all of which will improve loyalty and morale.

You can also look for skills gaps that you absolutely must recruit externally to fill, and understand how best to do that, especially at short notice. What are the most cost-efficient and effective channels, for example? You might consider whether offshoring for skills is helpful, or maintaining a base of experienced temporary workers that you can call on.


Unknown unknowns

Yet these are all known gaps. Organisations now also have to consider recruiting people for unknown jobs too. Some estimates suggest that as many as two-thirds of primary school children will end up working in jobs that don’t yet exist. So what new roles are being created in your industry, and how are you selecting people that will be able to grow into them?


Maximise the value of your HR function

Your HR organisation must be capable of supporting these changes, and ready to do so, and that means three things. First, the strategic workforce planning activities described above, supported by modern data and analytics. Next, HR has to provide the very best employee experience possible, enabling personal development and support. Finally, it needs to be able to support the process of constant change itself, and move to a more agile way of operating.


Get the culture right

Creating and nurturing a strong culture is essential here, and that relies on close co-ordination between HR, line managers and employees. Having a core system of record on everyone’s roles and various skills supports all these objectives, and can help you to grow your business through the modern era of change.


Essential enablers for implementing a modern product strategy

Mon, 2018-05-21 05:49

Continuous improvement across your entire mix of products and services is essential to innovate and stay competitive nowadays. Digital disruption requires companies to transform, successfully manage a portfolio of profitable offerings, and deliver unprecedented levels of innovation and quality. But creating your product portfolio strategy is only the first part—four key best practices are necessary to successfully implement it.

New technologies—the Internet of Things (IoT), Big Data, social media, 3D printing, and digital collaboration and modelling tools—are creating powerful opportunities to innovate. Increasingly customer-centric propositions are being delivered ‘as-a-service’ via the cloud, with just-in-time fulfilment joining up multiple parts of the supply chain. Your products and services have to evolve continually to keep up, generating massive amounts of data that have to be fed back in to inform future development.


Common language

To minimise complexity, it’s essential that there is just one context for all communication. You therefore need a standardised—and well-understood—enterprise product record that acts as a common denominator for your business processes. And that means every last piece of information—from core service features to how your product uses IoT sensors; from business processes to your roadmap for innovation, and all other details—gets recorded in one place, in the same way, for every one of your products, from innovation through development to commercialisation.

That will make it far easier for you to collect and interpret product information; define service levels and deliver on them; support new business models, and manage the overall future design of your connected offerings. Moreover, it enables your product development methods to become more flexible, so they can be updated more frequently, enabled by innovations in your supply chain, supported more effectively by IT, and improved over time.
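To make the idea of a single enterprise product record concrete, here is a minimal sketch of what such a record might look like as a data structure. The field names are illustrative assumptions for this example, not Oracle's actual product-record schema.

```python
from dataclasses import dataclass, field, asdict

@dataclass
class ProductRecord:
    """Illustrative enterprise product record: one shared structure,
    filled in the same way for every product, from innovation through
    development to commercialisation."""
    product_id: str
    name: str
    lifecycle_stage: str                      # e.g. "innovation", "development", "commercialisation"
    core_features: list = field(default_factory=list)
    iot_sensors: list = field(default_factory=list)   # how the product uses IoT sensors
    roadmap_notes: list = field(default_factory=list) # roadmap for innovation

record = ProductRecord(
    product_id="PRD-001",
    name="Connected Thermostat",
    lifecycle_stage="development",
    core_features=["remote control", "usage analytics"],
    iot_sensors=["temperature", "humidity"],
)
print(asdict(record)["lifecycle_stage"])  # development
```

Because every product is described in the same shape, downstream processes (service-level definitions, impact analysis, reporting) can consume one common format rather than many ad hoc ones.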


Greater quality control in the digital world…

By including form, fit and function rules—that describe the characteristics of your product, or part of it—within the product record, you add a vital layer of change control. It enables you to create a formal approvals process for quality assurance. For example, changes made in one area—whether to a product or part of it—may create problems in other areas. The form, fit and function rules force you to perform cross-functional impact analyses and ensure you’re aware of any consequences.

As part of this, you can run simulations with ‘digital twins’ to predict changes in performance and product behaviour before anything goes wrong. This obviously has major cost-saving implications, enabling far more to be understood at the drawing-board stage. Moreover, IoT applications can be leveraged to help product teams test and gather data from your connected assets or production facilities.


Transparency and effective communications

The enterprise product record should also contain a full audit trail of decisions about the product, including data from third parties, and from your supply chain. The objective is full traceability from the customer perspective—with evidence of regulatory compliance, provenance of preferred suppliers, and fully-auditable internal quality processes. Additionally, it’s often helpful to be able to prove the safety and quality of your product and processes, as that can be a key market differentiator. Powerful project management and social networking capabilities support the collaborative nature of the innovation process.


Lean and efficient

Overall, your innovation platform should be both lean and efficient, based on the continual iteration of the following key stages:

  • Ideation, where you capture, collaborate and analyse ideas
  • Proposal, where you create business cases and model potential features
  • Requirements, where you evaluate, collaborate and manage product needs
  • Concepts, where you accelerate product development and define structures
  • Portfolio analysis, where you revise and optimise your product investment
  • Seamless Integration with downstream ERP and Supply Chain processes
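The continual iteration of these stages can be sketched as a simple cycle. The stage names below mirror the list above; the code itself is purely illustrative.

```python
from enum import Enum

class InnovationStage(Enum):
    """The key stages of the innovation platform, iterated continually."""
    IDEATION = 1            # capture, collaborate and analyse ideas
    PROPOSAL = 2            # create business cases and model potential features
    REQUIREMENTS = 3        # evaluate, collaborate and manage product needs
    CONCEPTS = 4            # accelerate product development and define structures
    PORTFOLIO_ANALYSIS = 5  # revise and optimise product investment
    ERP_INTEGRATION = 6     # hand off to downstream ERP and supply chain processes

def next_stage(stage):
    """Advance through the iteration, wrapping back to ideation for the next cycle."""
    order = list(InnovationStage)
    return order[(order.index(stage) + 1) % len(order)]

print(next_stage(InnovationStage.IDEATION).name)         # PROPOSAL
print(next_stage(InnovationStage.ERP_INTEGRATION).name)  # IDEATION
```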


The result: Powerful ROI

Being able to innovate effectively in a digital supply chain delivers returns from both top-line growth—with increased revenues and market share—and reduced costs from improved safety, security, sustainability and fewer returns.


Cloud: Look before you leap—and discover unbelievable new agility

Mon, 2018-05-21 05:48

All around the world, finance teams are now fully embracing the cloud to simplify their operations. The heady allure of reduced costs, increased functionality, and other benefits are driving the migration. Yet what’s getting people really excited is the unexpected flush of new business agility they experience after they’ve made the change.

At long last, the cloud is becoming accepted as the default environment to simplify ERP and EPM. Fifty-six percent* of finance teams have already moved to the cloud—or will do so within the next year—and 24% more plan to move at some point soon.


Major cost benefits in the cloud

Businesses are making the change to enjoy a wide range of benefits. According to a recent survey by Oracle*, reducing costs is (predictably) the main motivation, with improved functionality in second place—and culture, timing and the ability to write-off existing investments also key factors. The financial motivation breaks down into a desire to avoid infrastructure investment and on-premises upgrades, and also to achieve a lower total cost of ownership.

And Cloud is delivering on its promise in all these areas—across both ERP and EPM, 70% say they have experienced economic benefits after moving to the cloud.


Leap for joy at cloud agility

But the biggest overall benefit of moving to the cloud—quoted by 85% of those who have made the change—is staying current on technology. Moreover, 75% say that cloud improves usability, 71% say it increases flexibility and 68% say that it enables them to deploy faster. Financial gain is the top motivation for moving to the cloud, but that’s only the fourth-ranked advantage overall once there. It turns out that the main strengths of the cloud are in areas that help finance organisations improve business agility.

These are pretty amazing numbers. It would be unheard of, until fairly recently, for any decent-sized organisation to consider migrating its core ERP or EPM systems without a very, very good reason. Now, the majority of companies believe that the advantages of such a move—and specifically, moving to the cloud—overwhelm any downside.


The commercial imperative

Indeed, the benefits are increasingly viewed as a competitive necessity. Cloud eliminates the old cycle of new system launches every two or three years—replacing it with incremental upgrades several times each year, and easy, instant access to additional features and capabilities.

And that is, no doubt, what’s behind the figures above. Finance professionals have an increasingly strong appetite to experiment with and exploit the latest technologies. AI, robotic process automation, internet of things, intelligent bots, augmented reality and blockchain are all being evaluated and used by significant numbers of organisations.

They’re improving efficiency in their day-to-day operations, joining-up operating processes across their business and reducing manual effort (and human error) through increased automation. Moreover, AI is increasingly being applied to analytics to find answers to compelling new questions that were, themselves, previously unthinkable—providing powerful new strategic insights.

Finance organisations are becoming more agile—able to think smarter, work more flexibly, and act faster using the very latest technical capabilities.


But it’s only available via cloud-based ERP and EPM

Increasingly, all these advances are only being developed as part of cloud-based platforms. And more and more advanced features are filtering down to entry-level cloud solutions—at least in basic form—encouraging finance people everywhere to experiment with what’s possible. That means, if you’re not yet using these tools in the cloud, you’re most likely falling behind the competitors that are—and that applies both from the broader business perspective and from the internal operating competency viewpoint.

The cloud makes it simple to deploy, integrate and experiment with new capabilities, alongside whatever you may already have in place. It has become the new normal in finance. It seems like we’re now at a watershed moment where those that embrace the potential of cloud will accelerate away from those that do not, and potentially achieve unassailable new operating efficiencies.

The good news is that it’s easy to get started.  According to MIT Technology Review in a 2017 report, 86% of those making a transition to the cloud said the costs were in line with, or better than expected, and 87% said that the timeframe of transition to the cloud was in line with, or better than expected.

_______

* Except where stated otherwise, all figures in this article are taken from ‘Combined ERP and EPM Cloud Trends for 2018’, Oracle, 2018.


You’ve got to start with the customer experience

Mon, 2018-05-21 05:47

Visionary business leader Steve Jobs once remarked: ‘You’ve got to start with the customer experience and work backwards to the technology.’ From someone who spent his life creating definitive customer experiences in technology itself, these words should carry some weight—and are as true today as ever.

The fact is that customer experience is a science, and relevance is its key goal. A powerful customer experience is essential to compete today. And relevance is what cuts through the noise of the market to actually make the connection with customers.


The fundamentals of success

For companies to transform their customer experience, they need to be able to streamline their processes and create innovative customer experiences. They also have to be able to deliver by connecting all their internal teams together so they always speak with one consistent voice.

But that’s only part of the story. Customers have real choice today. They’re inundated with similar messages to yours and are becoming increasingly discerning in their tastes.

Making yourself relevant depends on the strength of your offering and content, and the effectiveness of your audience targeting. It also depends on your technical capabilities. Many of your competitors will already be experimenting with powerful new technologies to increase loyalty and drive stronger margins.


The value of data

Learning to collect and use relevant customer data is essential. Data is the lifeblood of modern business—it’s the basis of being able to deliver any kind of personalised service on a large scale. Businesses need to use data to analyse behaviour, create profiles for potential new customers, build propositions around those target personas and then deliver a compelling experience. They also need to continually capture new data at every touchpoint to constantly improve their offerings.

Artificial intelligence (AI) and machine learning (ML) have a key role to play both in the analysis of the data and also in the automation of the customer experience. These technologies are developing at speed to enable us to improve our data analysis, pre-empt changing customer tastes and automate parts of service delivery.


More mature digital marketing

You can also now add in all kinds of technologies to the customer experience mix that are straight out of sci-fi. The internet of things (IoT) is here, with connected devices providing help in all kinds of areas—from keeping you on the right road to telling you when your vehicle needs maintenance, from providing updates on your order status to delivering personal service wherever you are, and much more—enabling you to drive real transformation.

Moreover, intelligent bots are making it much easier to provide high-quality, cost-effective, round-the-clock customer support—able to deal with a wide range of issues—and using ML to improve their own performance over time.

Augmented reality makes it possible to add contextual information, based on your own products and services, to real-world moments. So, if you’re a car manufacturer you may wish to provide help with simple roadside repairs (e.g. change of tire) via a smartphone app.


Always omnichannel

Finally, whether at the pre-sale or delivery stage, your customer experience platform must give you the ability to deliver consistency at every touchpoint. Whatever the channel, whatever the time, whatever the context, your customers must feel that your whole business speaks with one voice.

Indeed, as Michael Schrage, a Harvard Business Review contributor, said: ‘Innovation is an investment in the capabilities and competencies of your customers. Your future depends on their future.’ So you have to get as close as possible to your customers to learn what they want today, and understand what experiences they are likely to want tomorrow. Work backwards from that and use any technology that can help you deliver it.

How APIs help make application integration intelligent

Mon, 2018-05-21 05:47

Artificial intelligence (AI) represents a technology paradigm shift, with the potential to completely revolutionise the way people work over the next few years. Application programming interfaces (APIs) are crucially important in enabling the rapid development of these AI applications. Conversely, AI is also being used to validate APIs themselves, and to analyse and optimise their performance.

Wikipedia defines an API as a ‘set of subroutine definitions, protocols and tools for building application software’. In slightly less dry terms, an API is basically a gateway to the core capabilities of an application, enabling that functionality to be built into other software. So, for example, if you were creating an app that needed to show geographic location, you might choose to implement Google Maps’ API. It’s obviously much easier, faster and future-proof to do that than to build your own mapping application from scratch.

 

How APIs are used in AI

And that’s the key strength of APIs—they’re a hugely efficient way of enabling networked systems to communicate and draw on each other’s functionality, offering major benefits for creating AI applications.

Artificially intelligent machine ‘skills’ are, of course, just applications that can be provided as APIs. So if you ask your voice-activated smart device—whether it’s Siri, Cortana, Google Assistant, or any of the rest—what time you can get to the Town Hall via bus, its response will depend on various skills that might include:

  • Awareness of where you are—from a geo-location API
  • Knowledge of bus routes and service delays in your area—from a publicly available bus company API
  • Tracking of general traffic and passenger levels—from APIs that show user locations provided by mobile device manufacturers
  • Being able to find the town hall—from a mapping API

None of these APIs needs to know anything about the others. They simply take information in a pre-defined format and output data in their own way. The AI application itself has to understand each API’s data parameters, tie all their skills together, apply the intelligence and then process the data.
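That composition can be sketched in a few lines. Every function below is a hypothetical stand-in for the corresponding API, not a real service; the point is that each one is independent, and only the application layer ties them together.

```python
# Hypothetical stand-ins for independent APIs; none knows about the others.

def geolocate():
    """Stand-in for a geo-location API: the user's current coordinates."""
    return {"lat": 51.5074, "lon": -0.1278}

def bus_routes(lat, lon):
    """Stand-in for a bus company API: routes serving a location, with delays."""
    return [{"route": "24", "delay_min": 3}, {"route": "88", "delay_min": 0}]

def find_place(name):
    """Stand-in for a mapping API: coordinates of a named place."""
    return {"lat": 51.5073, "lon": -0.1276}

def eta_to(place_name):
    """The 'AI application' layer: ties the independent skills together."""
    here = geolocate()
    dest = find_place(place_name)          # destination coordinates
    routes = bus_routes(here["lat"], here["lon"])
    best = min(routes, key=lambda r: r["delay_min"])  # least-delayed route
    base_journey_min = 20                  # assumed baseline journey time
    return {"route": best["route"],
            "eta_min": base_journey_min + best["delay_min"]}

print(eta_to("Town Hall"))  # → {'route': '88', 'eta_min': 20}
```

Each stand-in could be swapped for a real HTTP call without changing `eta_to`, which is exactly the decoupling the paragraph above describes.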

 

Everything is possible

That means you can combine the seemingly infinite number of APIs that exist in any way you like, giving you the power to produce highly advanced applications—and create unique sources of value for your business. You could potentially build apps to enhance the customer experience, improve your internal processes, and analyse data more effectively to strengthen decision making—and perhaps even identify whole new areas of business to get into.

 

How AI is being used to improve APIs

APIs are the ideal way of getting information into AI applications, and also help to streamline analytics—yet artificial intelligence also has a vital role to play within API development itself. For example, AI can be used to automatically create, validate and maintain API software development kits (implementations of APIs in different programming languages).

AI can also be used to monitor API traffic. By analysing calls to APIs using intelligent algorithms, you can identify problems and trends, potentially helping you tailor and improve the APIs over time. Indeed, AI can be used to analyse internal company system APIs, for example, helping you score sales leads, predict customer behaviour, optimise elements of your supply chain, and much more.
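As a deliberately simple illustration of traffic monitoring, the sketch below flags anomalous API calls with a plain statistical rule. The log entries and the two-standard-deviation threshold are invented for the example; a production system would apply trained models to far richer features than latency alone.

```python
# Toy sketch of spotting problem calls in an API gateway log.
from statistics import mean, stdev

calls = [  # hypothetical log entries: (endpoint, latency in ms)
    ("/orders", 118), ("/orders", 120), ("/orders", 121), ("/orders", 123),
    ("/orders", 125), ("/orders", 126), ("/orders", 128), ("/orders", 129),
    ("/orders", 130), ("/orders", 132), ("/orders", 900),  # 900 ms stands out
]

latencies = [ms for _, ms in calls]
mu, sigma = mean(latencies), stdev(latencies)

# Flag calls more than two standard deviations above the mean latency.
anomalies = [(ep, ms) for ep, ms in calls if ms > mu + 2 * sigma]
print(anomalies)  # → [('/orders', 900)]
```

Over time the same idea scales up: cluster calls by endpoint, caller and payload shape, and let the model learn what "normal" traffic looks like for each.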

 

GDPR: What are the priorities for the IT department?

Mon, 2018-05-21 05:46

All too often it is assumed that GDPR compliance is ‘IT’s problem’ because having your personal data and technology in order are such vital parts of it. But compliance must be an organisation-wide commitment. No individual or single department can make an organisation compliant. However, in planning discussions around GDPR compliance, there are clear areas where IT can add significant value.

 

1. Be a data champion

The potential value of data to organisations is increasing all the time, but many departments, business units and even board members may not realise how much data they have access to, where it resides, how it is created, how it could be used and how it is protected. The IT department can play a clear role in helping the wider organisation understand why data, and by extension GDPR, is so important, and how to realise the value of that data while using and protecting it properly.

 

2. Ensure data security

GDPR considers protection of personal data a fundamental human right. Organisations need to ensure they understand what personal data they have access to and put in place appropriate protective measures. IT has a role to play in working with the organisation to assess security risks and ensure that appropriate protective measures, such as encryption, access controls, attack prevention and detection are in place.

 

3. Help the organisation be responsive

GDPR requires organisations to not only protect personal data but also respond to requests from individuals who, among other rights, may want to amend or delete data held on them. That means that the personal data must be collected, collated and structured in a way that enables effective and reliable control over all personal data. This means breaking down internal silos and ensuring an organisation has a clear view of its processing activities with regard to personal data.

 

4. Identify the best tools for the job

GDPR compliance is as much about process, culture and planning as it is about technology. However, there are products available which can help organisations with key elements of GDPR compliance, such as data management, security and the automated enforcement of security measures. Advances in automation and artificial intelligence mean many tools offer a level of proactivity and scalability that doesn’t lessen the responsibility of people within the organisation, but can reduce the workload and establish an approach that evolves with changing compliance requirements.

 

5. See the potential

An improved approach to security and compliance management, fit for the digital economy, can give organisations the confidence to unlock the full potential of their data. If data is more secure, better ordered and easier to make sense of, it stands to reason an organisation can do more with it. It may be tempting to see GDPR as an unwelcome chore. It should however be borne in mind that it is also an opportunity to seek differentiation and greater value, and to build new data-driven business models, confident in the knowledge that the data is being used in a compliant way. Giving consumers the confidence to share their data is also good for businesses.

 

The IT department will know better than most how the full value of data can be unlocked and can help businesses pull away from seeing GDPR as a cost of doing business and start seeing it as an opportunity to do business better.

Autonomous: A New Lens for Analytics

Mon, 2018-05-21 05:45

Welcome to the era of intelligent, self-driving software. Just as self-driving vehicles are set to transform motoring, self-driving software promises to transform our productivity, and strengthen our analytical abilities.

Perhaps you drive an automatic car today—how much are you looking forward to the day your car will automatically drive you? And how much more preferable would smoother, less time-consuming journeys be—always via the best route, with fewer hold-ups, and automatically avoiding unexpected road congestion—where you only have to input your destination? The technology is almost here, and similar advances are driving modern business applications.

AI and machine learning are finally coming of age thanks to the recent advances in big data that created—for the first time—data sets that were large enough for computers to draw inferences and learn from. That, along with years of SaaS application development in cloud computing environments, means that autonomous technology—harnessing both AI and business intelligence—is now fuelling self-driving software… for both cars and cloud applications.

 

Autonomy—beyond automation

Automation has, of course, been around for years. But autonomy—running on AI and machine learning—takes it to new levels. Today’s software is truly self-driving—it eliminates the need for humans to provision, secure, monitor, back up, recover, troubleshoot or tune. It upgrades and patches itself, and automatically applies security updates, all while running normally. Indeed, an autonomous data warehouse, for example, can reduce administration overheads by up to 80%.

 

Intelligent thinking

But the greatest value is perhaps in what AI enables you to discover from your data. When applied to analytics, it can identify patterns in huge data sets that might otherwise go unnoticed. So, for example, you could apply AI to sales data to identify trends—who bought what, where, when and why?—and apply those to improve the accuracy of your future forecasts.
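As a deliberately tiny illustration of ‘who bought what, where’, the snippet below counts product/region pairs in a toy sales log using only the standard library. The records are invented for the example; real AI-driven analytics would apply trained models to far larger data sets.

```python
# Trivial pattern-finding over a toy sales log.
from collections import Counter

sales = [  # hypothetical records: (product, region)
    ("umbrella", "north"), ("umbrella", "north"), ("sunscreen", "south"),
    ("umbrella", "north"), ("sunscreen", "south"), ("sunscreen", "north"),
]

# Who bought what, where? Count sales per (product, region) pair.
trend = Counter(sales)
top_pair, top_count = trend.most_common(1)[0]
print(top_pair, top_count)  # → ('umbrella', 'north') 3
```

The same grouping idea, scaled up and fed into a forecasting model, is what turns historical sales data into the more accurate future forecasts described above.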

Alternatively, if you were looking for a vibrant location for new business premises, you might use AI to search for an area with a strong social media buzz around its restaurants and bars. You could teach the software to look for specific words or phrases, and harness machine learning to improve results over time.

AI technology is already widely used in HR to take the slog out of sifting through huge numbers of job applications. As well as being faster and requiring less manpower, it’s able to remove both human bias—critical in the highly subjective area of recruitment—and also identify the best candidates based on factors such as the kind of language they use.

 

Knowledge and power for everyone

These technologies are coming online now—today—for everyone. In the past, database reporting was typically run by data analysts or scientists to update pre-existing dashboards and reports. Nowadays many more business users are demanding access to such insights, which is being made possible by tools that are far easier to use.

Anyone can experiment with large samples of different data sets, combining multiple data formats—structured and unstructured—and discovering new trends. They can get answers in context, at the right time, and convert them into simple-to-understand insights, enabling decisions to be made more quickly for competitive advantages.

 

Smarter and smarter…

Yet it’s the strength of those insights that’s really compelling. As one commentator observed: ‘Machine intelligence can give you answers to questions that you haven’t even thought of.’ The quality of those answers—and their underlying questions—will only improve over time. That’s why it’s becoming a competitive imperative to embrace the power of intelligent analytics to ensure you can keep pace with market leaders.

 

Discover how…

In my last blog, I shared how organisations can profit from data warehouses and data marts, and how Oracle’s self-driving, self-securing, and self-repairing Autonomous Data Warehouse saves resources on maintenance, allowing investment in data analytics.

 

CPQ is an Auditor’s Best Friend

Mon, 2018-05-21 03:00

By Andy Pieroux, Founder and Managing Director of Walpole Partnership Ltd.  

One of the reasons many companies invest in a Configure, Price and Quote (CPQ) system is to provide a robust audit trail for their pricing decisions. Let’s take a look at why, and how CPQ can help.


First, apologies if you are an auditor. I’ve always been on the business side - either in sales, sales management, or as a pricing manager. I can appreciate your view may be different from the other side of the controls. Perhaps by the end of this article our points of view will be a little closer.

If your business has the potential to get audited, I know that I can speak on your behalf to say we all just love being audited. We love the time taken away from our day jobs. We love the stress of feeling that something may be unearthed that exposes us or gets us in trouble, even if we’ve never knowingly done anything wrong. We love the thought of our practices being exposed as 'in need of improvement' and relish the chance to dig through old documents and folders to try and piece together the story of why we did what we did… especially when it was several years ago. Yes sir, bring on the audit.

The reason we love it so much is that in our heart of hearts, we know audits are needed for our organization to prosper in the future. We dread the thought that our company might be caught up in a scandal like the mis-selling of pensions, or PPI (payment protection insurance), or serious accounting frauds like Enron.

It was scandals like Enron in the early 2000s that gave rise to stricter audit requirements and Sarbanes-Oxley (SOX). This set a high standard for internal controls and introduced much tougher penalties for board members who fail to ensure that financial statements are accurate. The role of pricing decisions (e.g. who authorized what and when), and the accuracy of revenue reporting, becomes paramount when evidencing compliance with audit arrangements such as this.

At this point, a CPQ system can be the simple answer to your audit needs. All requests for discount, and the way revenue is allocated across products and services, are documented. All approvals can be attributed to an individual, time-stamped, and captured with the reasons given at the time of approval. More importantly, the ability to show an auditor the entire history of a decision, and to follow the breadcrumbs from a signed deal all the way to reported revenue at the click of a button, means you have nothing to hide and a clear understanding of the decisions. This is music to an auditor’s ears. It builds trust and confidence in the process and means any anomalies can be quickly analyzed.
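A minimal sketch of the kind of append-only approval trail described above is shown below. The class and field names are invented for illustration and do not reflect any actual CPQ product’s schema.

```python
# Illustrative append-only audit trail for discount approvals.
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass(frozen=True)  # frozen: a record cannot be altered once written
class ApprovalRecord:
    deal_id: str
    approver: str
    discount_pct: float
    reason: str
    timestamp: datetime

class AuditTrail:
    def __init__(self):
        self._records = []  # append-only in this sketch

    def record(self, deal_id, approver, discount_pct, reason):
        """Capture who approved what, when, and why."""
        self._records.append(ApprovalRecord(
            deal_id, approver, discount_pct, reason,
            datetime.now(timezone.utc)))

    def history(self, deal_id):
        """Follow the breadcrumbs for one deal, oldest first."""
        return [r for r in self._records if r.deal_id == deal_id]

trail = AuditTrail()
trail.record("D-1001", "j.smith", 12.5, "Matched competitor bid")
trail.record("D-1001", "a.jones", 15.0, "Exec approval for strategic account")
for r in trail.history("D-1001"):
    print(r.timestamp.isoformat(), r.approver, r.discount_pct, r.reason)
```

In a real system the trail would live in tamper-evident storage with controlled access, but the essential audit property is the same: every decision is attributable, time-stamped, and reconstructable on demand.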

When you have all this information securely stored in the cloud, with access restricted to only those who need it, and a tamper-proof process designed with integrity in mind, passing an audit becomes so much easier. All the anxiety and pain mentioned above disappears. Auditors are no longer the enemy. You will find they can help advise on improvements to the rules in your system to make future audits even more enjoyable. Yes - that’s right…. I said it. Enjoyable Audits!

So, CPQ is an auditor’s friend, and an auditee’s friend too. It doesn’t just apply to the big-scale audit requirements like SOX, but any organization that is auditable. Whether you’re a telecommunications company affected by IFRS 15, an organization impacted by GDPR, or any one of a thousand other guidelines, rules or quality policies that get checked - having data and decisions stored in a CPQ system will make you love audits too.

 

 
