PeopleSoft Technology Blog


Using Drop Zones to Isolate Customizations

Thu, 2019-04-25 13:33

The ability to customize PeopleSoft applications has always been a powerful and popular aspect of PeopleSoft products.  This power and flexibility came at a cost, however. Although customizations are valuable in that they enable customers to meet unique and important requirements that are not part of the standard delivered products, they are difficult and costly to maintain.

PeopleSoft has been working hard to enable customers to continue developing those valuable customizations, but to implement them in a way that isolates them from our delivered products.  This minimizes life-cycle impact and allows customers to take new images without re-implementing customizations each time.  Providing the ability to isolate customizations is a high-priority investment for us, and we've developed several features that support it.  The latest is Drop Zones.  Drop Zones became available with PeopleTools 8.57, and customers must be on 8.57 to implement them. 

First let's look at the benefits as well as things you must consider when using Drop Zones:

  • Customers can add custom fields and other page elements without life cycle impact
  • You have the full power of PeopleTools within Drop Zones.  You can apply PeopleCode to custom elements
  • Reduces LCM time when taking a new image.  No need to re-implement customizations!
  • Works on Fluid pages
  • It's still developer work, some of which is done in App Designer
  • Doesn't reduce implementation time for customization (the benefit is during LCM)
  • Some pages won't work with Drop Zones
  • No support for Classic pages (as of 8.57)

What Does PeopleSoft Deliver?

You can only use drop zones on pages delivered by PeopleSoft applications.  (Don't add your own--that would be a customization.)  Drop Zones will be delivered with the following application images:

  • FSCM 31
  • HCM 30
  • ELM 19
  • CRM 17

Our application teams are delivering drop zones in pages where customizations are most common.  This was determined in consultation with customers.  Typical pages have two drop zones: one at the top, the other at the bottom.  However, there may be cases with more or fewer drop zones.

How Do I Implement Drop Zones?

Move to PeopleTools 8.57 or later, and take application images that have drop zones.

  1. Review and catalog your page customizations and determine whether they can be moved to Drop Zones.  Compare your list to delivered pages with Drop Zones.  (Lists for all applications are available under Key Concepts > Configuration.)

  2. Create subpages with customizations you want to implement (custom fields, labels, other widgets...)  In the example here, we've created a simple subpage with static text and a read-only Employee ID field.
  3. Insert your custom subpage into the Drop Zone.
  4. Configure your subpage to the component containing the page.  There may be more than one drop zone available, so make sure you choose the one you want.  Subpages are identified by labels on their group boxes.

    Your custom subpage will be dynamically inserted at runtime.  Any fields and other changes on your subpages are loaded into the component buffer along with delivered content.  Your subpages are displayed as if part of the main page definition.  End users will see no difference between custom content and delivered content.

Now this customization is isolated, and will not be affected when you take the next application image.  Your customization will be carried forward, and you don't have to re-implement it every time you take a new image.  These changes will not appear in a Compare Report.

What Can I Do to Prepare for Using Drop Zones?

Even if you are not yet on an application image that contains Drop Zones, you can prepare ahead of time, making implementation faster. 

  1. Review and catalog your page customizations and compare them against the pages with delivered drop zones in the images you will eventually uptake.  
  2. Consider which page customizations you want to implement with Drop Zones.  Prioritize them.
  3. Start building subpages containing those customizations.

When you move to the application images that contain drop zones, you can simply insert the subpages you've created as described above.

See this video for a nice example of isolating a customization with Drop Zones.


PeopleTools 8.57 is now available to download to your site

Tue, 2019-01-15 13:42

Oracle announced yesterday that PeopleTools 8.57.04 Product Patch is now available for on-premises downloads.  Go to the PeopleTools 8.57 Home Page on My Oracle Support (doc 2433119.2) for more information and a link to the PeopleSoft Patches Home Page.  This patch is also available on Oracle Cloud Infrastructure (OCI).  All customers using PeopleSoft Cloud Manager will be able to subscribe to the PeopleTools 8.57 channel and have the new patch automatically downloaded to their cloud repository and ready for one-click upgrade/patching.

I'd like to extend thanks to all the customers and partners who took the time to upgrade a PeopleSoft instance to 8.57 using PeopleSoft Cloud Manager.  Many have shared their stories in webinars, conference sessions, and blog posts.  One of my favorites is from Graham Smith of Cedar Consulting, who coined the term 'Magic Upgrade PeopleTools Button' in his post 'PeopleTools 8.57 Upgrade Magic'.  The name is going to stick.

Visit my previous post PeopleTools 8.57 is Available on the Oracle Cloud for more information and links to additional resources on the new Tools release.  Also, don't forget to stay informed with the Tech Updates (doc 764222.1).  There are important updates about how 8.57 is the last PeopleTools release that will support MAP technology and the DB2 LUW platform.

For those of you who would still like to try the 'Magic Upgrade Button' in the cloud, you can still do that with a free trial account.  Instructions have been made into a spotlight video, Upgrading to 8.57 on Oracle Cloud using PeopleSoft Cloud Manager.  You can find a link to that video and all spotlight videos here.

Migrating PeopleSoft Applications to Oracle Cloud Infrastructure

Mon, 2019-01-07 19:00

There is a great opportunity to learn more about running your PeopleSoft Applications on Oracle Cloud Infrastructure (OCI).  OCI is Oracle's leading Infrastructure as a Service (IaaS) Cloud Platform, and many PeopleSoft customers are choosing it as the IaaS platform for PeopleSoft.

In this virtual workshop you'll hear all the technical details about OCI and see firsthand how you as a PeopleSoft customer can benefit.

  • Title: Migrating PeopleSoft to Oracle Cloud Infrastructure
  • Time: Wednesday, January 16, 12:00pm EST/9:00am PST

Click here to register for the event.

Specialists will be on the call to answer your questions after the webinar.

Don't miss this exciting event!

Elasticsearch 6.1.2 Will Soon Be Available

Wed, 2019-01-02 16:18

Elasticsearch 6.1.2 is the minimum version planned for PeopleTools with release 8.57 (more information on that coming soon).  In addition, 6.1.2 is also planned for availability on PeopleTools 8.55 and 8.56.  Upgrading to Elasticsearch 6.1.2 will require full indexing of all search definitions.  Customers will be able to use the Live Cutover feature in the PeopleSoft Search Framework to migrate to Elasticsearch 6.1.2 with no downtime.

If you are using Elasticsearch 2.3.2, and want to continue on it, that version will be supported for one year following our support of version 6.1.2.  At that point, Oracle will only support version 6.1.2. Note that support for PeopleTools 8.55 ends before October 2019, so Elasticsearch 2.3.2 and 6.1.2 support for 8.55 ends with the last CPU for 8.55.  At that point no fixes to the PeopleSoft Search Framework or Elasticsearch will be available for 8.55.

Look for more announcements on this important transition, including new features for Search that will be available with PeopleTools 8.57.  These will be posted on the Search concepts page.

How to Launch PeopleSoft Cloud Manager Using a Pre-built Image for Oracle Cloud Infrastructure

Wed, 2018-11-14 23:58

With the latest updates to Oracle Cloud Infrastructure, you now have a cool new way of launching instances.  This new feature makes it easier than ever before to launch a PeopleSoft Cloud Manager instance.  We received your feedback on how downloading and uploading Cloud Manager was taking too much time.  I’m very happy to say that we’ve heard you, and now we have delivered a surprisingly simple solution.

Before creating the Cloud Manager instance, you should have set up your networking (VCN and subnet) and configured all required security rules.

To launch a new Cloud Manager instance, navigate to your tenancy and go to the dashboard, as shown here, or the Instances page.  From the dashboard, click the option Create a VM instance. 

You will now see the new pages to create an instance as shown below.  Enter the name for your instance and select the availability domain (AD) in which you want to create your Cloud Manager instance. 

The next step is to select the Cloud Manager image. By default, the Oracle Linux 7.5 image is chosen.  Click Change Image Source to search for the Cloud Manager image.

On the Browse All Images page, select the Oracle Images tab.  Here you’ll find the latest Cloud Manager image for OCI.  Select the image and accept the terms and restrictions after reading them. Click Select Image to use the chosen image.

Choose the Virtual Machine instance type and select a VM shape of your choice. 

Select a compartment in which the Cloud Manager instance will be created.  Select a VCN and a subnet for the Cloud Manager instance configuration. Click Create to deploy the instance.

There you go!  Creating a Cloud Manager instance is now so easy. 

Note that you still need to download the Oracle Linux Image for Cloud Manager from My Oracle Support, upload it to object storage, and import it as a custom image.  After setting up the Cloud Manager instance, continue from the Downloading the Oracle Linux Image and Uploading to Object Storage step in the install guide here.

After you have the Oracle Linux image, and the Cloud Manager instance is provisioned and running, SSH into the instance and follow the instructions in the install guide to run the Cloud Manager Instance Configuration Wizard.

How to run PeopleTools 8.57 in the Cloud

Sat, 2018-09-22 01:13

Now that 8.57 is generally available to run in the cloud, I’ll outline the steps to get it up and running so you can take it for a spin. As announced in an earlier blog, PeopleTools 8.57 is initially released to run on the Oracle Cloud before it’s available for on-premises installs.  That means you’ll want to use PeopleSoft Cloud Manager 7 to get it up and running.  There’s a good chance this is your first exposure to Cloud Manager, so it’s worth explaining what you’ll do.

The process is pretty simple.  First, you’ll get an Oracle Cloud Infrastructure (OCI) account.  Then you’ll install Cloud Manager 7 onto that account.  Once installed, Cloud Manager 7 is used to download the 8.57 DPKs into your Cloud Repository.  Then you’ll use Cloud Manager 7 to upgrade a PUM image from 8.56 to 8.57 using the downloaded DPKs.  If you’ve been through a PeopleTools upgrade before, you’ll be amazed how Cloud Manager automates this process.

The steps outlined below assume that you are new to OCI and Cloud Manager.  If you are already familiar with Cloud Manager and have it running in your tenancy, then update your Cloud Manager instance using Interaction Hub Image 07 as described here and go to Step 4.

Step 1: Get your Oracle Cloud trial account.  If you are new to OCI and PeopleSoft Cloud Manager, then please go through all links below to help you get started quickly.

Using the link given above, request a free account that gives you access to all Oracle Cloud services, up to 3500 hours or USD 300 in free credits (available in select countries, with limited validity). When your request is processed, you will be provisioned a tenancy in Oracle Cloud Infrastructure. Oracle will send you a Welcome email with instructions for signing in to the console for the first time. There is no charge unless you choose Upgrade to Paid from My Services in the console.

Step 2: Set up your OCI tenancy by creating users, policies and networks.  Refer to OCI documentation here for details or get a quick overview in this blog.  After this step, you will have your tenancy ready to deploy PeopleSoft Cloud Manager. If you are on OCI Classic then you can skip this step.

Step 3: Install and configure PeopleSoft Cloud Manager.  Follow the OCI install guide here. If you are on OCI Classic, follow the install guide here.

As part of this step, you will:

  • Download the Cloud Manager images
  • Upload images to your tenancy
  • Create a custom Microsoft Windows 2012 image
  • Spin up the Cloud Manager instance
  • Run bootstrap to install Cloud Manager
  • Configure Cloud Manager settings
  • Set up file server repository

Step 4: Subscribe to download channels.  The PeopleTools 8.57 download channel is now available under the unsubscribed channel list.  Navigate to Dashboard | Repository | Download Subscriptions.  Click the Unsubscribed tab. Subscribe to the new Tools_857_Linux channel using the related Actions menu. Also subscribe to any application download channels, for example HCM_92_Linux. Downloading the new PeopleTools version takes a while; wait for the status to indicate success.

Step 5:  Provision a demo environment using PUM Fulltier topology.  Create a new environment template to deploy an application that you downloaded in Step 4.  Use the provided PUM Fulltier topology in the template.  Using this newly created template, provision a new environment. If you already have a PUM environment deployed through Cloud Manager, then you can skip this step. 

Step 6: After the environment is provisioned, navigate to Environment Details | Upgrade PeopleTools.  On the right pane, you’ll have an option to select the PeopleTools 8.57 version that you have downloaded.  Select the PeopleTools version that you want to evaluate. Click Upgrade to begin the upgrade process. 

You’ll be able to see the job details and the steps running on the same page.  Click on the ‘In progress’ status link to view the upgrade job status. 

After the upgrade is complete, click the status link for detailed upgrade job status.

The PeopleTools upgrade is now complete.  Log in to the upgraded environment's PIA to evaluate the new features in PeopleTools 8.57.  For more information, see the PeopleTools 8.57 Documentation.

PeopleTools 8.57 is Available on the Oracle Cloud

Fri, 2018-09-21 15:17

We are pleased to announce that PeopleTools 8.57 is generally available for install and upgrade on the Oracle Cloud.  As we announced earlier, PeopleTools 8.57 will initially be available only on the Oracle Cloud.  We plan to make PeopleTools 8.57 available for on-premises downloads with the 8.57.04 CPU patch in January 2019.  

There are many exciting new features in PeopleTools 8.57, including:

  • The ability for end users to set up conditions in analytics that notify the user when the conditions are met
  • Improvements to the way Related Content and Analytics are displayed
  • Add custom fields to Fluid pages with minimum life-cycle impact
  • More capabilities for end user personalization
  • Improved search that supports multi-facet selections
  • Easier than ever to brand the application with your corporate colors and graphics
  • Fluid page preview in AppDesigner and improved UI properties interface
  • End-to-end command-line support for life-cycle management processes
  • And much more!

You’ll want to get all the details and learn about the new features in 8.57.  A great place to start is the PeopleTools 8.57 Highlights Video posted on the PeopleSoft YouTube channel.  The highlights video gives you an overview of the new features and shows how to use them.

There is plenty more information about the release available today.  Here are some of the other places you can go to learn more about 8.57.

In addition to releasing PeopleTools 8.57, version 7 of PeopleSoft Cloud Manager is also being released today.  CM 7 is similar in functionality to CM 6 with additional support for PeopleTools 8.57.  If you currently use a version of Cloud Manager you must upgrade to version 7 in order to install PT 8.57. 

There are a lot of questions about how to get started using PeopleTools 8.57 and Cloud Manager 7.  Documentation and installation instructions are available on the Cloud Manager Home Page.

More information will be published over the next couple of weeks to help you get started with 8.57 on the cloud.  This will include blogs covering details of the installation, a video that shows the complete process from creating a free trial account to running PT 8.57, and a detailed Spotlight Video that describes configuring OCI and Cloud Manager 7.

PeopleTools 8.57 is a significant milestone for Oracle, making it easier than ever for customers to use, maintain and run PeopleSoft Applications.

How to upload PeopleSoft Cloud Manager 6 Images to OCI

Mon, 2018-08-06 00:46

In the previous blog, we set up the OCI tenancy.  In this blog, let's look at how to get Cloud Manager images into your tenancy.  In OCI Classic, Cloud Manager images can be downloaded into your identity domain simply by going to Oracle Cloud Marketplace and using the Get App feature. Currently there is no similar capability for OCI. Instead, you must download the Cloud Manager OCI images to a local system and then upload them into an object storage bucket in your tenancy. 

Although you cannot use the Get App feature in Oracle Cloud Marketplace to automatically download the Cloud Manager Images into OCI, you begin in Oracle Cloud Marketplace to locate the Cloud Manager images for OCI.

Locate and Download the Images from My Oracle Support

To begin, go to Oracle Cloud Marketplace, search for Cloud Manager, and select the listing for PeopleSoft Cloud Manager Image 06 for OCI, as shown in this example.

When you click on ‘Get App’ on this listing, you will be redirected to a My Oracle Support Knowledge Base document with links to the Cloud Manager Image 6.0 and a PeopleSoft Linux image.  The PeopleSoft Linux Image for Cloud Manager is an Oracle Linux 6.9 image that can be used to deploy PeopleSoft environments.  You can use this Linux image with no further changes to get started, or use it as a base image to create your own custom image by installing any packages that are required by your organization. 

You can download these images to any existing Linux or Microsoft Windows instance in OCI, or use your local on-premises system. To upload these images to your OCI tenancy, you must first install the OCI Command Line Interface (CLI).  Since there are multiple methods to install and configure the OCI CLI, let’s take a quick look at which one is best suited to uploading Cloud Manager images to OCI object storage.

Install OCI CLI

Using the automated CLI installer is the easiest option.  Let’s assume that we are using an on-premises Linux desktop to download images.  You can also use a Microsoft Windows desktop or laptop with the same approach; there will be differences in the commands you need to run, which are clearly documented here.

To upload these images, you need to set up OCI CLI on the Linux desktop where you downloaded the images.  Login to the Linux desktop and follow the instructions below to install OCI CLI.  More details on installing OCI CLI can be found in the Oracle Cloud documentation.

  1. Open a terminal.
  2. Run the installer script with the following command (all on one line).

bash -c "$(curl -L https://raw.githubusercontent.com/oracle/oci-cli/master/scripts/install/install.sh)"

  3. Respond to the installer's prompts. 

Configure OCI CLI

At this point, the OCI CLI is installed on your Linux desktop.  Now, you must configure it to work seamlessly with your OCI tenancy. 

To simplify configuration, use the setup dialog process, which walks you through the first-time setup process step-by-step. For the setup dialog process, use the oci setup config command, which prompts you for the information required for the config file and the API public/private keys. The setup dialog generates an API key pair and creates the config file. 

After setting up the OCI CLI, it is important to add the API public key to your user settings.  Log in to OCI web console and navigate to Identity > Users > <your user name> > Add Public Key.
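For reference, the config file that the setup dialog writes to ~/.oci/config has this shape (all values below are placeholders, not from this post):

```ini
[DEFAULT]
user=ocid1.user.oc1..<unique_id>
fingerprint=<api_key_fingerprint>
key_file=~/.oci/oci_api_key.pem
tenancy=ocid1.tenancy.oc1..<unique_id>
region=us-ashburn-1
```

The fingerprint shown here must match the API public key you add to your user settings in the console.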

Upload Cloud Manager Images to OCI

Now the OCI CLI setup is complete and we can use CLI to upload Cloud Manager images that were downloaded earlier. Using the OCI web console, create an object storage bucket. You can then upload the downloaded images into this bucket using OCI CLI.  The command to upload a file to an object storage bucket is as shown below.  Modify it to suit your tenancy and execute the command on the Linux desktop for each image file, to upload the downloaded images.

oci os object put -ns <tenancy_name> -bn <bucket_name> --file <cm_image_file> --name <destination_file_name> --no-multipart

Both Cloud Manager images take a while to upload.  After the upload is complete, follow the process explained in the OBE for Installing Cloud Manager in OCI to import them as custom images into your tenancy.
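Since the same command is run once per image file, a small wrapper can save retyping. This is a sketch only: the tenancy namespace, bucket, and file names are placeholders, and the command is printed with a leading echo for review rather than executed.

```shell
# Placeholders -- substitute your tenancy's object storage namespace,
# your bucket name, and the image files you downloaded.
TENANCY_NS="<tenancy_namespace>"
BUCKET="<bucket_name>"

# Build the upload command for one file; it is echoed for review --
# remove the 'echo' to actually run it.
upload_image() {
  echo oci os object put -ns "$TENANCY_NS" -bn "$BUCKET" \
       --file "$1" --name "$(basename "$1")" --no-multipart
}

for f in cm_image_1.tgz cm_image_2.tgz; do   # your downloaded image files
  upload_image "$f"
done
```

The --name flag here simply reuses each file's base name as the object name in the bucket.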

How to Set Up OCI Tenancy for PeopleSoft Cloud Manager – Part II

Mon, 2018-07-02 03:43

In the previous blog on setting up tenancy, you learned that a Virtual Cloud Network (VCN) and subnets must be created as a prerequisite for PeopleSoft Cloud Manager Image 6. In Oracle Cloud Infrastructure (OCI), you have the flexibility to configure networking to mimic your on-premises environment, and networking can be configured in many ways for PeopleSoft environments:

  • Separate subnets for demo, development, testing, pre-production and production environments
  • Two subnets, one for production and another for non-production environments
  • One subnet each for database, middle tier components (Application Server, Process Scheduler, and PIA), PeopleSoft Client, Elasticsearch instances and load balancer
  • Three subnets: one for the load balancer, a second for the database and a third for the rest of the instance types

Subnets can be either public or private.  You can deploy all instances on a private subnet to secure them from the Internet, put them on a public subnet where they are accessible from the Internet, or put a few on private and a few on public subnets.  Complete flexibility!

In this blog, let’s take an example of a simple networking architecture. This example network has one VCN with one public subnet and one private subnet.  The public subnet (red dotted line in the illustration below) hosts the Cloud Manager, File Server, middle tier, PeopleSoft (Windows) Clients and Elasticsearch instances.  Database instances are hosted in a private subnet. 
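In outline, the example network looks like this (the public subnet's CIDR,, comes from the security-list rules later in this post; the private subnet's CIDR is not specified):

```text
VCN
├── Public subnet ( Cloud Manager, file server, middle tier
│     (Application Server, Process Scheduler, PIA), PeopleSoft (Windows)
│     clients, Elasticsearch
└── Private subnet: database instances
```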


First, create a VCN using the OCI web console by selecting Networking, Virtual Cloud Networks. Let’s call it MyVCN.

Note. To locate the commands mentioned here, click the Navigation Menu at the top left of the OCI console home page.

You can choose to create the default subnets by selecting the option Create Virtual Cloud Network plus related resources, or select Create Virtual Cloud Network only to create your own customized subnets.  Let’s select the latter option for the purposes of this blog. In the example below, a VCN is created with a CIDR block.  You can use a CIDR that mimics your on-premises networking too. Read more on VCNs and subnets here to familiarize yourself with the concepts and management of networks.

The next important part is creating security lists.  By default, every subnet will have a security list.  You can customize the default security list or create your own.  In the sample network illustrated above, there are two security lists: one for the database subnet and another for the middle tier subnet. The security list for the database subnet can have rules as shown below.  The first rule allows the database instance to be accessed over the Internet on port 22 for SSH, and the second rule allows all network connections arising from the middle tier subnet ( to reach all ports on instances running in the database subnet.

Now, let’s take a look at the middle tier subnet security list. Here is the summary of the rules:

  • Rule 1 – allow all connections to port 22 (SSH access)
  • Rule 2 – allow all connections to port range 8000 – 8200 (HTTP PIA access)
  • Rule 3 – allow all connections to port range 8443 – 8643 (HTTPS PIA access)
  • Rule 4 – allow all connections between instances within middle tier subnet
  • Rule 5 – allow all connections to port 3389 (RDP to Windows Client)

Also add Egress rules for each subnet as per your requirement.  The example shown below allows connections to all destinations.
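The same middle tier rules can be expressed as a single OCI CLI call instead of console clicks. This is a sketch only: the compartment and VCN OCIDs are placeholders, and the command is printed with a leading echo for review rather than executed.

```shell
# The five ingress rules from the list above, in the JSON shape the
# OCI API expects, plus the allow-all egress rule.
INGRESS='[
 {"protocol":"6","source":"","tcpOptions":{"destinationPortRange":{"min":22,"max":22}}},
 {"protocol":"6","source":"","tcpOptions":{"destinationPortRange":{"min":8000,"max":8200}}},
 {"protocol":"6","source":"","tcpOptions":{"destinationPortRange":{"min":8443,"max":8643}}},
 {"protocol":"all","source":""},
 {"protocol":"6","source":"","tcpOptions":{"destinationPortRange":{"min":3389,"max":3389}}}
]'
EGRESS='[{"protocol":"all","destination":""}]'

# Printed for review -- remove the 'echo' and fill in real OCIDs to execute.
echo oci network security-list create \
     --compartment-id "<compartment_ocid>" --vcn-id "<vcn_ocid>" \
     --display-name midtier-seclist \
     --ingress-security-rules "$INGRESS" \
     --egress-security-rules "$EGRESS"
```

Protocol "6" is TCP; "all" covers every protocol, matching the allow-all rules in the list above.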

Define an internet gateway as shown below.  Also add any route tables if required. More details are available in OCI documentation here.

It is important to ensure that the Cloud Manager and file server instances have access to all subnets on which new environments will be provisioned.  The NFS share on the file server, where DPKs are stored, is mounted on the instances being provisioned, so all NFS ports must be opened in the security lists. More details about the VCN, subnets and security lists specific to Cloud Manager are available here.

The security rules in this blog are not very restrictive; they serve as examples to start with.  It is always recommended to restrict access to subnets by adding rules that allow only the required ports.  To learn more about securing access, refer to the OCI documentation here.

Next, create subnets in each Availability Domain.  In the example below, two subnets are created for the Availability Domain US-ASHBURN-AD-1.  One subnet is used to host database instances and the other to host the remaining instances, such as the middle tier (running Application Server, PIA server and Process Scheduler), Windows client and Elasticsearch.  Create similar subnets in each of the availability domains so that you have a uniform network layout in each.  Ensure that you select the right security list for each subnet.

With this, the tenancy is ready for you to set up PeopleSoft Cloud Manager.  You have set up a compartment, user, policies, VCN, subnets and security lists.  Learn more about installing Cloud Manager here and how to use it here.

How to Set up OCI Tenancy for PeopleSoft Cloud Manager – Part I

Mon, 2018-07-02 02:33

In our previous blog we announced the release of the latest PeopleSoft Cloud Manager Image 6, which includes support for Oracle Cloud Infrastructure (OCI).  Cloud Manager for OCI is designed to work in a simple way.  Let’s look at the design aspects that must be considered in your planning.

  • Deploy all instances of a PeopleSoft environment in a single OCI Region
    Cloud Manager can deploy PeopleSoft environments in only one region, the one in which it is running.  When you sign in to Cloud Manager in a browser and access the Cloud Manager Infrastructure Settings page, you see this region listed as “Deployment Region.” When you create or configure an environment template in Cloud Manager, you will see all regions listed, but you must always specify the region chosen as Deployment Region on the Cloud Manager Infrastructure Settings page.
  • Deploy all instances of a PeopleSoft environment in a single Availability Domain (AD) in the chosen Deployment Region
    In environment templates, you can set the AD in which you want to deploy an environment.  All instances are deployed in the chosen AD.  Cloud Manager doesn’t provide the ability to choose different ADs for each instance of the same environment. For example, you will not be able to deploy a Database instance in one AD and middle tier components (Application Server, Process Scheduler, and PIA) or a PeopleSoft Client instance in another AD.
  • Deploy all instances of a PeopleSoft environment in a single compartment
    While defining environment templates you get an option to choose the compartment in which an environment will be deployed.  All instances of an environment will always be deployed in the selected compartment.  You will not be allowed to deploy instances of the same environment across two or more compartments.  For example, you cannot deploy a Database instance in one compartment and middle tier instances or a PeopleSoft Client instance on another compartment. 
  • Deploy the Cloud Manager instance, and all instances of a PeopleSoft environment, on a single VCN that is dedicated for PeopleSoft environments. 
    The Cloud Manager instance must be deployed in the same VCN as all instances of a PeopleSoft environment.  Under the dedicated VCN, you can create multiple subnets that can be used to isolate individual instances or production and non-production environments.

Let’s go through the requirements that satisfy the above-mentioned design considerations.  To provision PeopleSoft environments in OCI using Cloud Manager, you’ll need to prepare your tenancy with a few prerequisite configurations. 

Create a compartment

When you get access to your OCI tenancy, you’ll have an administrator user only.  You can find an overview of how to set up your tenancy here.

Note. To locate the commands mentioned here, click the Navigation Menu at the top left of the OCI console home page.

Let’s start by creating a compartment in which all PeopleSoft environments will be provisioned.  If you want to deploy your development, test or production environments on separate compartments, then create enough compartments for the various environments.  For this blog, let’s assume we have created a compartment and named it nkpsft.

Create a cloud user for PeopleSoft

Sign in as the administrator user for OCI, and create a user for Cloud Manager with sufficient privileges to create and manage instances, the Virtual Cloud Network and subnets.  This involves multiple steps as outlined below.   

First, create a Group for Cloud Manager users; for example, CM-Admin.

Next, create a policy, for example CM-Admin-Policy, to grant access to compartment nkpsft to Cloud Manager group.

After that, create a user cloudmgr that will be used in Cloud Manager Settings.

Finally, add the cloudmgr user to the CM-Admin group.  With this, we have set up a compartment and a user with admin privileges to create and manage instances on OCI.
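Sketched as CLI calls, the console steps above might look like this (OCIDs and descriptions are placeholder assumptions; each command is printed with a leading echo for review rather than executed):

```shell
# Group, policy, user, and membership from the steps above.
# The policy statement grants the group manage rights in compartment nkpsft.
STATEMENTS='["Allow group CM-Admin to manage all-resources in compartment nkpsft"]'

echo oci iam group create --name CM-Admin --description "'Cloud Manager admins'"
echo oci iam policy create --compartment-id "<tenancy_ocid>" --name CM-Admin-Policy \
     --description "'Access for Cloud Manager group'" --statements "$STATEMENTS"
echo oci iam user create --name cloudmgr --description "'Cloud Manager service user'"
echo oci iam group add-user --group-id "<group_ocid>" --user-id "<user_ocid>"
```

Note that group add-user takes OCIDs, not names, so you would capture the OCIDs returned by the create calls before running the final command.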

In our next blog, we’ll talk about the next important prerequisite: networking.  Cloud Manager Image 6 on OCI does not enforce a flat networking architecture, as it does in OCI Classic.  Your network admin can design and lay out the required networking architecture.  You can create your own networking layout, and Cloud Manager automation will use it to deploy PeopleSoft environments.  Remember, you’ll be creating networking components in the same compartment that you created earlier, so that Cloud Manager gets access to them. 


Initial release of PeopleTools 8.57 will be in the Oracle Cloud

Thu, 2018-05-31 16:44

Oracle recently announced in a Tech Update that PeopleTools 8.57 will be generally available to all PeopleSoft customers for use first in the Oracle Cloud.  Shortly after, most likely with the third PeopleTools patch (8.57.03), it will be available for on-premises environments. This, understandably, has generated some questions, so I thought I’d take a few minutes to clear things up. 

When we talk about running PeopleSoft in the cloud, we are talking about taking one or more of your existing PeopleSoft environments and, instead of running them in your data center, running them on hardware that you subscribe to and manage in the cloud.  In this case, the cloud is Oracle’s Infrastructure as a Service, also called Oracle Cloud Infrastructure (OCI).   When you move one of your environments to the cloud, it’s backed up to files, copied to a cloud repository, and provisioned from the repository.  We call that process ‘Lift and Shift’.  When it’s done, your application, with your data, your customizations, and your configurations, is running in the cloud.  Even though it’s in the cloud, you are responsible for maintaining it.  What has changed is the infrastructure (servers, storage, network) that the application is running on.

Just to be clear, there is no Software-as-a-Service (SaaS) version of PeopleSoft or PeopleTools, nor does Oracle have any plans to release one.   So, thinking it through, the 8.57 release of PeopleTools that we make available in the cloud is exactly the same as the version you will install in your on-premises environment.  Why then are we releasing it first in the cloud?

There are several reasons, but the most significant is to build awareness of the benefits of running PeopleSoft applications in the Oracle Cloud, and what that can mean to you.  We believe that in the long term, the best way to run PeopleSoft applications is in the cloud.  To make it even better, one of the major initiatives over the past couple of years has been the release of PeopleSoft Cloud Manager.  With Cloud Manager, many of the processes that are time consuming or difficult, particularly around lifecycle management, have been improved or automated.  The PeopleTools upgrade, for instance, is automated.  Just choose one of your application images from your cloud repository and start it with the latest Tools version, and the app will be upgraded as part of the provisioning process.  It’s that easy.  And that’s just one example.

It’s pretty easy to take advantage of the initial releases of PeopleTools in the cloud.  In fact, as I write this there is a 30-day trial program that gives you free credits to try it out.  Be sure to follow the correct OBE when installing.  For more information, go to this link or talk to your Oracle account team.  There is also a PeopleSoft Key Concepts page about running PeopleSoft on the Oracle Cloud and PeopleSoft Cloud Manager where you can get more information.  It only takes a small investment to try this out, and it could lead to major improvements in how you manage your applications.

When Screen Scraping became API calling – Gathering Oracle OpenWorld Session Catalog with ...

Sun, 2018-05-20 03:16

A dataset with all sessions of the upcoming Oracle OpenWorld 2017 conference is nice to have – for experiments and demonstrations with many technologies. The session catalog is exposed at a website here.

With searching, filtering, and scrolling, all available sessions can be inspected. If data is available in a browser, it can be retrieved programmatically and persisted locally, for example in a JSON document. A typical approach for this is web scraping: having a server-side program act like a browser, retrieve the HTML from the web site, and query the data from the response. This process is described, for example, in this article for Node and the Cheerio library.

However, server-side screen scraping of HTML will only be successful when the HTML is static. Dynamic HTML is constructed in the browser by executing JavaScript code that manipulates the browser DOM. If that is the mechanism behind a web site, server-side scraping is at the very least considerably more complex (as it requires the server to emulate a modern web browser to a large degree). Selenium has been used in such cases to provide a server-side, programmatically accessible browser engine. Alternatively, screen scraping can also be performed inside the browser itself, as is supported for example by the Getsy library.

As you will find in this article, when server-side scraping fails, client-side scraping may be a much too complex solution. It is very well possible that the rich client web application is using a REST API that provides the data as a JSON document: an API that our server-side program can also easily leverage. That turned out to be the case for the OOW 2017 website, so instead of complex HTML parsing and server-side or even client-side scraping, the challenge at hand resolves to nothing more than a little bit of REST calling. Read the complete article here.
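A minimal sketch of that API-calling approach: the endpoint URL and the JSON shape below are hypothetical (the real catalog API has to be discovered in the browser's network tab), so the example works against a small local sample, but the whole "scraper" collapses to a curl call plus a jq query:

```shell
#!/bin/bash
# Hypothetical catalog endpoint; in practice, copy the real URL from the
# browser's network tab:
#   CATALOG_URL="https://example.com/api/search/sessions"
#   curl -s "$CATALOG_URL" -o sessions.json
# For illustration, work with a small local sample of the assumed shape:
cat > sessions.json <<'EOF'
{"sessions":[{"code":"CON1234","title":"PeopleSoft in the Cloud"},
             {"code":"CON5678","title":"Kafka on OCI"}]}
EOF
# Pull out just the session titles
jq -r '.sessions[].title' sessions.json
```

Once the response shape is known, the same jq filter that works on the sample works on the live API response.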

PaaS Partner Community

For regular information on business process management and integration, become a member of the SOA & BPM Partner Community; for registration please visit (OPN account required). If you need support with your account, please contact the Oracle Partner Business Center.

Blog Twitter LinkedIn Facebook Wiki

Technorati Tags: SOA Community,Oracle SOA,Oracle BPM,OPN,Jürgen Kress

Solve digital transformation challenges using Oracle Cloud

Sun, 2018-05-20 03:15



Digital transformation is an omnipresent topic today, presenting many challenges as well as opportunities. Customers are therefore asking how to deal with those challenges and how to take advantage of the opportunities. Frequently asked questions in this area are:

  • How can we modernize existing applications?
  • What are the key elements of a future-proof IT system architecture strategy?
  • How can the flexibility and agility of the IT system landscape be ensured?

In our experience there is no single answer to these questions, since every customer has individual requirements and business needs. It is, however, necessary to find pragmatic solutions that build on existing best practices; there is no need to completely reinvent the wheel.

With our new poster "Four Pillars of Digitalization based on Oracle Cloud" (download it here), we deliver a set of harmonized reference models that we have evolved from practical experience while conceiving modern, future-oriented solutions in the areas of application design, integration architecture, infrastructure, and analytics. The guiding principle behind our architectural thinking is: Design for Change. If you want to learn more, refer to our corresponding ebook (find the ebook here; only available in German at the moment).

The technological base for modern application architectures today is usually cloud services, and the offerings of different vendors are constantly growing. Here it is important to know which cloud services are the right ones to implement a specific use case. Our poster "Four Pillars of Digitalization based on Oracle Cloud" shows the respective cloud services of our strategic partner Oracle that can be used to address specific challenges in the area of digitalization. Get the poster here.


Developer Partner Community

For regular information, become a member of the Developer Partner Community; please visit: (OPN account required). If you need support with your account, please contact the Oracle Partner Business Center.

Blog Twitter LinkedIn Forum Wiki

Technorati Tags: PaaS,Cloud,Middleware Update,WebLogic, WebLogic Community,Oracle,OPN,Jürgen Kress

Oracle API Platform Cloud Service Overview by Rolando Carrasco

Sat, 2018-05-19 03:25


  • Oracle API Platform Cloud Services - API Design: the first video of a series showcasing the usage of Oracle API Platform Cloud Services. API Management, part 1 of 2.
  • Oracle API Cloud Services: the second video of the series, showcasing the usage of the brand-new Oracle API Platform CS. This is part one of API Management.
  • Oracle API Platform Cloud Services - API Management, part 2: the third video of the series; specifically, the second part of the API Management functionality, focused on Documentation.
  • Oracle API Platform CS - How to create an app: the fourth video of the series, in which you will learn how to create an application.
  • Oracle API Platform Cloud Services - API Usage: the fifth video of the series, showcasing how to interact with the APIs deployed in APIPCS.



Why are Universal Cloud Credit and Bring Your Own License a great opportunity for Oracle Partners?

Sat, 2018-05-19 03:24

Oracle has simplified buying and consuming PaaS and IaaS cloud services. Customers can now purchase Universal Cloud Credits, which can be spent on any IaaS or PaaS service. Partners can start a PoC or project, e.g. with Application Container Cloud Service, and add additional services when required, e.g. Chatbot Cloud Service. The customer can use the universal cloud credits for any available or even upcoming IaaS and PaaS services.

Thousands of customers use Oracle Fusion Middleware and Databases today. With Bring Your Own License they can easily move workloads to the cloud. As they already own the license, the customer pays only a small uplift for the service portion of PaaS. This is a major opportunity for Oracle partners to offer services to these customers.

To learn more about Universal Cloud Credits and Bring Your Own License, attend the free on-demand training here.



Event Hub Cloud Service. Hello world

Sat, 2018-05-19 00:46

A while back, I wrote a blog about the Oracle Reference Architecture and the concepts of Schema on Read and Schema on Write. Schema on Read is well suited to a Data Lake, which may ingest any data as-is, without any transformation, and preserve it for a long period of time.

At the same time, you have two types of data: streaming and batch. Batch data could be log files or RDBMS archives; streaming data could be IoT or sensor feeds, or GoldenGate replication logs.

Apache Kafka is a very popular engine for acquiring streaming data. It has multiple advantages, such as scalability, fault tolerance, and high throughput. Unfortunately, Kafka is hard to manage. Fortunately, the cloud simplifies many routine operations. Oracle has three options for deploying Kafka in the cloud:

1) Big Data Cloud Service, where you get a full Cloudera cluster and can deploy Apache Kafka as part of CDH.

2) Event Hub Cloud Service Dedicated. Here you have to specify server shapes and some other parameters, but the rest is done by the cloud automagically.

3) Event Hub Cloud Service. This service is fully managed by Oracle; you don't even need to specify any compute shapes. The only things to do are to say how long you need to store data in the topic and how many partitions you need (partitions = performance).

Today, I'm going to tell you about the last option, the fully managed cloud service.

It's really easy to provision: just log in to your cloud account and choose the "Event Hub" cloud service.

After this, open the service console:

Next, click on "Create service":

Enter the parameters; the two key ones are Retention Period and Number of Partitions. The first defines how long messages are stored; the second defines performance for read and write operations.

Then click Next:

Confirm and wait a while (usually not more than a few minutes):

After a short while, you will see the provisioned service:



Hello world flow.

Today I want to show the "Hello world" flow: how to produce (write) and consume (read) a message with Event Hub Cloud Service.

The flow is (step by step):

1) Obtain OAuth token

2) Produce message to a topic

3) Create consumer group

4) Subscribe to topic

5) Consume message

Now I'm going to show it in some detail.

OAuth and Authentication token (Step 1)

To work with Event Hub Cloud Service, you have to be familiar with the concepts of OAuth and OpenID. If you are not, you can watch this short video or go through this step-by-step tutorial.

In a couple of words: OAuth is a token-based authorization method (it tells what I can access) for restricting access to resources.

One of the main ideas is to decouple the user (a real human, the Resource Owner) from the application (the Client). The user knows the login and password, but the Client does not use them every time it needs to reach the Resource Server (which holds the protected content). Instead, the application obtains an authorization token once and uses it when working with the Resource Server. This is brief; here you can find a more detailed explanation of what OAuth is.

Obtain Token for Event Hub Cloud Service client.

As you can see, to get access to the Resource Server (read: Event Hub messages) you need to obtain an authorization token from the Authorization Server (read: IDCS). Here, I'd like to show the step-by-step flow to obtain this token. I will start from the end and show the command (REST call) that you have to run to get the token:

#!/bin/bash
curl -k -X POST -u "$CLIENT_ID:$CLIENT_SECRET" \
  -d "grant_type=password&username=$THEUSERNAME&password=$THEPASSWORD&scope=$THESCOPE" \
  "$IDCS_URL/oauth2/v1/token" \
  -o access_token.json

As you can see, many parameters are required to obtain an OAuth token.

Let's take a look at where you can get them. Go to the service and click on the topic you want to work with; there you will find the IDCS Application. Click on it:

After clicking on it, you will be redirected to the IDCS Application page. You can find most of the credentials here. Click on Configuration:

On this page you will immediately find the Client ID and Client Secret (think of them as a login and password):


Scroll down and find the section called Resources:

Click on it

and you will find another two variables that you need for the OAuth token: Scope and Primary Audience.

One more required parameter, IDCS_URL, you can find in your browser:

You now have almost everything you need, except the login and password. These are your Oracle Cloud login and password (what you use when logging in to your Oracle Cloud account).

Now you have all the required credentials and are ready to write a script that automates all of this:

#!/bin/bash
export CLIENT_ID=7EA06D3A99D944A5ADCE6C64CCF5C2AC_APPID
export CLIENT_SECRET=0380f967-98d4-45e9-8f9a-45100f4638b2
export THEUSERNAME=john.dunbar
export THEPASSWORD=MyPassword
export SCOPE=/idcs-1d6cc7dae45b40a1b9ef42c7608b9afe-oehtest
export PRIMARY_AUDIENCE=
export THESCOPE=$PRIMARY_AUDIENCE$SCOPE
export IDCS_URL=
curl -k -X POST -u "$CLIENT_ID:$CLIENT_SECRET" \
  -d "grant_type=password&username=$THEUSERNAME&password=$THEPASSWORD&scope=$THESCOPE" \
  "$IDCS_URL/oauth2/v1/token" \
  -o access_token.json

After running this script, you will have a new file called access_token.json. The access_token field is what you need:

$ cat access_token.json {"access_token":"eyJ4NXQjUzI1NiI6InVUMy1YczRNZVZUZFhGbXFQX19GMFJsYmtoQjdCbXJBc3FtV2V4U2NQM3MiLCJ4NXQiOiJhQ25HQUpFSFdZdU9tQWhUMWR1dmFBVmpmd0UiLCJraWQiOiJTSUdOSU5HX0tFWSIsImFsZyI6IlJTMjU2In0.eyJ1c2VyX3R6IjoiQW1lcmljYVwvQ2hpY2FnbyIsInN1YiI6ImpvaG4uZHVuYmFyIiwidXNlcl9sb2NhbGUiOiJlbiIsInVzZXJfZGlzcGxheW5hbWUiOiJKb2huIER1bmJhciIsInVzZXIudGVuYW50Lm5hbWUiOiJpZGNzLTFkNmNjN2RhZTQ1YjQwYTFiOWVmNDJjNzYwOGI5YWZlIiwic3ViX21hcHBpbmdhdHRyIjoidXNlck5hbWUiLCJpc3MiOiJodHRwczpcL1wvaWRlbnRpdHkub3JhY2xlY2xvdWQuY29tXC8iLCJ0b2tfdHlwZSI6IkFUIiwidXNlcl90ZW5hbnRuYW1lIjoiaWRjcy0xZDZjYzdkYWU0NWI0MGExYjllZjQyYzc2MDhiOWFmZSIsImNsaWVudF9pZCI6IjdFQTA2RDNBOTlEOTQ0QTVBRENFNkM2NENDRjVDMkFDX0FQUElEIiwiYXVkIjpbInVybjpvcGM6bGJhYXM6bG9naWNhbGd1aWQ9N0VBMDZEM0E5OUQ5NDRBNUFEQ0U2QzY0Q0NGNUMyQUMiLCJodHRwczpcL1wvN0VBMDZEM0E5OUQ5NDRBNUFEQ0U2QzY0Q0NGNUMyQUMudXNjb20tY2VudHJhbC0xLm9yYWNsZWNsb3VkLmNvbTo0NDMiXSwidXNlcl9pZCI6IjM1Yzk2YWUyNTZjOTRhNTQ5ZWU0NWUyMDJjZThlY2IxIiwic3ViX3R5cGUiOiJ1c2VyIiwic2NvcGUiOiJcL2lkY3MtMWQ2Y2M3ZGFlNDViNDBhMWI5ZWY0MmM3NjA4YjlhZmUtb2VodGVzdCIsImNsaWVudF90ZW5hbnRuYW1lIjoiaWRjcy0xZDZjYzdkYWU0NWI0MGExYjllZjQyYzc2MDhiOWFmZSIsInVzZXJfbGFuZyI6ImVuIiwiZXhwIjoxNTI3Mjk5NjUyLCJpYXQiOjE1MjY2OTQ4NTIsImNsaWVudF9ndWlkIjoiZGVjN2E4ZGRhM2I4NDA1MDgzMjE4NWQ1MzZkNDdjYTAiLCJjbGllbnRfbmFtZSI6Ik9FSENTX29laHRlc3QiLCJ0ZW5hbnQiOiJpZGNzLTFkNmNjN2RhZTQ1YjQwYTFiOWVmNDJjNzYwOGI5YWZlIiwianRpIjoiMDkwYWI4ZGYtNjA0NC00OWRlLWFjMTEtOGE5ODIzYTEyNjI5In0.aNDRIM5Gv_fx8EZ54u4AXVNG9B_F8MuyXjQR-vdyHDyRFxTefwlR3gRsnpf0GwHPSJfZb56wEwOVLraRXz1vPHc7Gzk97tdYZ-Mrv7NjoLoxqQj-uGxwAvU3m8_T3ilHthvQ4t9tXPB5o7xPII-BoWa-CF4QC8480ThrBwbl1emTDtEpR9-4z4mm1Ps-rJ9L3BItGXWzNZ6PiNdVbuxCQaboWMQXJM9bSgTmWbAYURwqoyeD9gMw2JkwgNMSmljRnJ_yGRv5KAsaRguqyV-x-lyE9PyW9SiG4rM47t-lY-okMxzchDm8nco84J5XlpKp98kMcg65Ql5Y3TVYGNhTEg","token_type":"Bearer","expires_in":604800}

Create a Linux variable for it:

#!/bin/bash
export TOKEN=`cat access_token.json | jq .access_token | sed 's/\"//g'`

Well, now we have the authorization token and can work with our Resource Server (Event Hub Cloud Service).

Note: you can also check the documentation on how to obtain an OAuth token.

Produce Messages (Write data) to Kafka (Step 2)

The first thing that we may want to do is produce messages (write data to a Kafka cluster). To make scripting easier, it's also better to use environment variables for common resources. For this example, I'd recommend parametrizing the topic's endpoint, the topic name, and the content type. The content type is completely up to the developer, but you have to consume (read) the same format that you produce (write). The key parameter to define is the REST endpoint. Go to PSM, click on the topic name, and copy everything up to "restproxy":

You will also need the topic name, which you can take from the same window:

Now we can write a simple script to produce one message to Kafka:

#!/bin/bash
export OEHCS_ENDPOINT=
export TOPIC_NAME=idcs-1d6cc7dae45b40a1b9ef42c7608b9afe-oehtest
export CONTENT_TYPE=application/vnd.kafka.json.v2+json
curl -X POST \
  -H "Authorization: Bearer $TOKEN" \
  -H "Content-Type: $CONTENT_TYPE" \
  --data '{"records":[{"value":{"foo":"bar"}}]}' \
  $OEHCS_ENDPOINT/topics/$TOPIC_NAME

If everything is fine, the console will return something like:


Create Consumer Group (Step 3)

The first step to read data from OEHCS is to create a consumer group. We will reuse the environment variables from the previous step, but just in case, I'll include them in this script:

#!/bin/bash
export OEHCS_ENDPOINT=
export CONTENT_TYPE=application/vnd.kafka.json.v2+json
export TOPIC_NAME=idcs-1d6cc7dae45b40a1b9ef42c7608b9afe-oehtest
curl -X POST \
  -H "Authorization: Bearer $TOKEN" \
  -H "Content-Type: $CONTENT_TYPE" \
  --data '{"format": "json", "auto.offset.reset": "earliest"}' \
  $OEHCS_ENDPOINT/consumers/oehcs-consumer-group \
  -o consumer_group.json

This script generates an output file containing the variables that we will need to consume messages.

Subscribe to a topic (Step 4)

Now you are ready to subscribe to this topic (export the environment variables if you didn't do so before):

#!/bin/bash
export BASE_URI=`cat consumer_group.json | jq .base_uri | sed 's/\"//g'`
export TOPIC_NAME=idcs-1d6cc7dae45b40a1b9ef42c7608b9afe-oehtest
curl -X POST \
  -H "Authorization: Bearer $TOKEN" \
  -H "Content-Type: $CONTENT_TYPE" \
  -d "{\"topics\": [\"$TOPIC_NAME\"]}" \
  $BASE_URI/subscription

If everything is fine, this request returns nothing.

Consume (Read) messages (Step 5)

Finally, we reach the last step: consuming messages.

Again, it's a quite simple curl request:

#!/bin/bash
export BASE_URI=`cat consumer_group.json | jq .base_uri | sed 's/\"//g'`
export H_ACCEPT=application/vnd.kafka.json.v2+json
curl -X GET \
  -H "Authorization: Bearer $TOKEN" \
  -H "Accept: $H_ACCEPT" \
  $BASE_URI/records

If everything works as it is supposed to, you will see output like:
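The response body is a JSON array of record objects (this sample assumes the Kafka REST v2 JSON format; the topic, partition, and offset values are hypothetical). A quick way to keep only the message payloads is jq:

```shell
#!/bin/bash
# Hypothetical sample of a consumer response in the Kafka REST v2 JSON format
RESPONSE='[{"topic":"oehtest","key":null,"value":{"foo":"bar"},"partition":0,"offset":0}]'
# Keep only the message values
echo "$RESPONSE" | jq -c '.[].value'
```

The same filter can be piped directly onto the curl call above once real records come back.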



Today we saw how easy it is to create a fully managed Kafka topic in Event Hub Cloud Service, and we took our first steps with it: writing and reading a message. Kafka is a really popular message bus engine, but it's hard to manage. The cloud simplifies this and lets customers concentrate on developing their applications.

Here are some useful links:

1) If you are not familiar with REST APIs, I'd recommend going through this blog

2) There is an online tool that helps validate your curl requests

3) Here you can find some useful examples of producing and consuming messages

4) If you are not familiar with OAuth, here is a nice tutorial showing an end-to-end example

Why Now Is the Time for ERP in the Cloud

Fri, 2018-05-18 20:20

“The movement to cloud is an inevitable destination; this is how computing will evolve over the next several years.” So said Oracle CEO Mark Hurd at Oracle OpenWorld 2017. Based on the results of new research, that inevitability is here, now.

In our first ERP Trends Report, we surveyed more than 400 finance and IT leaders. We found that 76% of respondents said they either have plans for ERP in the cloud or have made the move already. They are recognizing that waiting puts them at a disadvantage; the time to make the move is now.

The majority of respondents cited economic factors as the reason they made the leap, and it’s easy to see why: Nucleus Research recently published a report finding that cloud delivers 3.2x the return on investment (ROI) of on-premises systems, while the total cost of ownership (TCO) is 52% lower.

But even more surprising were the benefits realized once our survey respondents got to the cloud. An astonishing 81% cited “Staying current on technology” as the main benefit of moving to cloud ERP. With a regular cadence of innovation delivered by the cloud, it is easier for companies to quickly incorporate game-changing technologies into everyday business processes—technologies like artificial intelligence, machine learning, the Internet of Things (IoT), blockchain and more. In the cloud, the risk of running their businesses on obsolete technology drops to zero. It’s the last upgrade they will ever need.

“One of the key value propositions in engaging with Oracle and implementing the cloud solutions has been the value of keeping current with technology and technological developments,” said Mick Murray, CFO of Blue Shield of California. “In addition to robotics, we’re looking at machine learning and artificial intelligence, and how do we apply that across the enterprise.”

As new capabilities are rolled out, cloud subscribers like Blue Shield can take advantage of them immediately. This gives them the agility to be both responsive and predictive. Uncertainty is the new normal in business and managing amid uncertainty is a must. It’s no longer enough to be quick-to-change; competitive companies must also have reliable insight into how potential future scenarios could impact performance.

So, what does that mean in terms of daily operations? Basically, it means people using knowledge to make good decisions in a fast, productive, and highly automated manner at all levels of the business. Cloud systems provide the data integration and ongoing technology refresh to incorporate best practices and technology advances.

The cloud also makes it easier to integrate external sources of valuable, contextual knowledge that helps improve the accuracy of data models. This is important considering the scope of threats to sustainable operations for businesses with large, global footprints. Political, environmental, and economic factors across multiple regions could impact business, such as limited travel capabilities slowing down delivery of key supplies.

Business uncertainty is everywhere, and organizations must be able to say, “What is our plan if X happens? What is our plan if X, Y, and Z happen, but W doesn’t?” And this insight must come quickly. Business moves too fast for reports to take days to compile.

ERP Replacement Effort Is Not What It Used to Be

One final stone on the scale in favor of ERP cloud is that migrating does not have to be painful. Don’t let memories of past onsite replacements haunt you. With the right products and the right expertise behind them, cloud migrations happen quickly, cause minimal business disruption, and don’t require intense user training.

For example, Blue Shield of California had set aside $600,000 on change management for the adoption of cloud; in the end, they barely spent anything. Change adoption, they reported, happened quickly and seamlessly.

Considering the benefits for cost savings, elimination of technology obsolescence, and ease of adopting emerging technologies, it is becoming harder to justify waiting on a migration to cloud ERP. Disruption is not an issue, and the long-term cost savings are substantial. Most importantly, modernizing ERP is an opportunity to modernize the business and embed an ever-refreshing technology infrastructure that enables higher performance on multiple levels.


7 Machine Learning Best Practices

Fri, 2018-05-18 20:11

Netflix’s famous algorithm challenge awarded a million dollars to the best algorithm for predicting user ratings for films. But did you know that the winning algorithm was never implemented into a functional model?

Netflix reported that the results of the algorithm just didn’t seem to justify the engineering effort needed to bring them to a production environment. That’s one of the big problems with machine learning.

At your company, you can create the most elegant machine learning model anyone has ever seen. It just won’t matter if you never deploy and operationalize it. That's no easy feat, which is why we're presenting you with seven machine learning best practices.

Download your free ebook, "Demystifying Machine Learning"

At the most recent Data and Analytics Summit, we caught up with Charlie Berger, Senior Director of Product Management for Data Mining and Advanced Analytics, to find out more. This article is based on what he had to say.

Putting your model into practice might take longer than you think. A TDWI report found that 28% of respondents took three to five months to put their model into operational use. And almost 15% needed longer than nine months.

Graph on Machine Learning Operational Use

So what can you do to start deploying your machine learning faster?

We’ve laid out our tips here:

1. Don’t Forget to Actually Get Started

In the following points, we’re going to give you a list of different ways to ensure your machine learning models are used in the best way. But we’re starting out with the most important point of all.

The truth is that at this point in machine learning, many people never get started at all. This happens for many reasons. The technology is complicated, the buy-in perhaps isn’t there, or people are just trying too hard to get everything e-x-a-c-t-l-y right. So here’s Charlie’s recommendation:

Get started, even if you know that you’ll have to rebuild the model once a month. The learning you gain from this will be invaluable.

2. Start with a Business Problem Statement and Establish the Right Success Metrics

Starting with a business problem is a common machine learning best practice. But it’s common precisely because it’s so essential and yet many people de-prioritize it.

Think about this quote, “If I had an hour to solve a problem, I’d spend 55 minutes thinking about the problem and 5 minutes thinking about solutions.”

Now be sure that you’re applying it to your machine learning scenarios. Below, we have a list of poorly defined problem statements and examples of ways to define them in a more specific way.

Machine Learning Problem Statements

Think about what your definition of profitability is. For example, we recently talked to a nation-wide chain of fast-casual restaurants that wanted to look at increasing their soft drinks sales. In that case, we had to consider carefully the implications of defining the basket. Is the transaction a single meal, or six meals for a family? This matters because it affects how you will display the results. You’ll have to think about how to approach the problem and ultimately operationalize it.

Beyond establishing success metrics, you need to establish the right ones. Metrics will help you establish progress, but does improving the metric actually improve the end-user experience? For example, your traditional accuracy measures might encompass precision and squared error. But if you're trying to create a model for airline price optimization, those don't matter if your cost per purchase and overall purchases aren't improving.

3. Don’t Move Your Data – Move the Algorithms

The Achilles heel in predictive modeling is that it’s a 2-step process. First you build the model, generally on sample data that can run in numbers ranging from the hundreds to the millions. And then, once the predictive model is built, data scientists have to apply it. However, much of that data resides in a database somewhere.

Let’s say you want data on all of the people in the US. There are more than 300 million people in the US—where does that data reside? Probably in a database somewhere.

Where does your predictive model reside?

What usually happens is that people take all of their data out of the database so they can run their equations with their model. Then they have to import the results back into the database to make those predictions. That process takes hours and hours, days and days, reducing the efficacy of the models you’ve built.

However, running your equations inside the database has significant advantages. Running the equations through the kernel of the database takes a few seconds, versus the hours it would take to export your data. The database can do all of your math and build the model internally, too. This means one world for the data scientist and the database administrator.

By keeping your data within your database and Hadoop or object storage, you can build models and score within the database, and use R packages with data-parallel invocations. This allows you to eliminate data duplication and separate analytical servers (by not moving data), and allows you to score models, embed data prep, build models, and prepare data in just hours.

4. Assemble the Right Data

As James Taylor with Neil Raden wrote in Smart Enough Systems, cataloging everything you have and deciding what data is important is the wrong way to go about things. The right way is to work backward from the solution, define the problem explicitly, and map out the data needed to populate the investigation and models.

And then, it’s time for some collaboration with other teams.

Machine Learning Collaboration Teams

Here’s where you can potentially start to get bogged down. So we will refer to point number 1, which says, “Don’t forget to actually get started.” At the same time, assembling the right data is very important to your success.

To figure out the right data to use to populate your investigation and models, you will want to talk to people in three major areas: the business domain, information technology, and data analysis.

Business domain—these are the people who know the business.

  • Marketing and sales
  • Customer service
  • Operations

Information technology—the people who have access to data.

  • Database administrators

Data analysts—the people who know the data.

  • Statisticians
  • Data miners
  • Data scientists

You need their active participation. Without it, you’ll get comments like:

  • These leads are no good
  • That data is old
  • This model isn’t accurate enough
  • Why didn’t you use this data?

You’ve heard it all before.

5. Create New Derived Variables

You may think, I have all this data already at my fingertips. What more do I need?

But creating new derived variables can help you gain much more insightful information. For example, you might be trying to predict the number of newspapers and magazines sold the next day. Here’s the information you already have:

  • Brick-and-mortar store or kiosk
  • Sell lottery tickets?
  • Amount of the current lottery prize

Sure, you can make a guess based on that information. But if you first compare the amount of the current lottery prize against the typical prize amounts, and then compare that derived variable against the variables you already have, you’ll have a much more accurate answer.
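A sketch of that derived variable, with all prize amounts hypothetical: the new feature is simply the ratio of today's prize to a typical prize.

```python
# Hypothetical outlet records with the raw variables listed above.
outlets = [
    {"type": "kiosk", "sells_lottery": True, "current_prize": 250_000_000},
    {"type": "store", "sells_lottery": True, "current_prize": 40_000_000},
]
typical_prize = 100_000_000  # assumed long-run average prize

for o in outlets:
    # Derived variable: how unusual is today's prize? A ratio well
    # above 1.0 suggests extra foot traffic and extra sales.
    o["prize_ratio"] = o["current_prize"] / typical_prize
```

A model fed `prize_ratio` can learn "unusually large jackpot means more sales," which the raw prize amount alone does not express as cleanly.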

6. Consider the Issues and Test Before Launch

Ideally, you should be able to A/B test with two or more models when you start out. Not only will you learn which approach performs better, but you’ll also be able to feel more confident knowing that you’re doing it right.

But going further than thorough testing, you should also have a plan in place for when things go wrong—for example, when your metrics start dropping. You’ll need an alert of some sort so the problem can be investigated ASAP. And when a VP comes into your office wanting to know what happened, you’ll have to explain it to someone who likely doesn’t have an engineering background.
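A minimal sketch of both ideas, with two stand-in models, a tiny holdout set, and an assumed alert tolerance (all hypothetical):

```python
# Score two hypothetical candidate models on the same holdout set.
def accuracy(model, data):
    return sum(model(x) == y for x, y in data) / len(data)

model_a = lambda x: x > 0.5
model_b = lambda x: x > 0.6

holdout = [(0.2, False), (0.55, True), (0.7, True), (0.9, True), (0.4, False)]

acc_a = accuracy(model_a, holdout)  # the A/B comparison
acc_b = accuracy(model_b, holdout)

def metric_alert(current, baseline, tolerance=0.05):
    """Return True when a live metric has dropped enough to investigate."""
    return current < baseline - tolerance
```

In production the holdout comparison becomes live traffic splitting, and `metric_alert` would page someone rather than return a boolean, but the shape is the same: compare models on identical data, and watch the winning model's metric against its launch baseline.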

Then of course, there are the issues you need to plan for before launch. Complying with regulations is one of them. For example, let’s say you’re applying for an auto loan and are denied credit. Under regulations such as the GDPR, you have the right to know why. Of course, one of the problems with machine learning is that it can seem like a black box, and even the engineers and data scientists can’t always say why certain decisions were made. However, some vendors can help by ensuring your algorithms produce prediction details.
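One honest way to produce a prediction detail is to use a model whose score decomposes by feature, as in this hypothetical linear credit-scoring sketch (all weights and values invented for illustration):

```python
# Hypothetical linear credit model: each feature's contribution to the
# score is weight * value, so a denial can be explained by naming the
# most negative contributor.
weights = {"income": 0.3, "debt_ratio": -0.5, "years_employed": 0.2}
applicant = {"income": 0.4, "debt_ratio": 0.9, "years_employed": 0.1}

contributions = {f: weights[f] * applicant[f] for f in weights}
score = sum(contributions.values())
main_reason = min(contributions, key=contributions.get)  # biggest drag on the score
```

For more opaque models, the same kind of per-feature attribution has to be approximated, which is exactly the gap the prediction-detail tooling mentioned above tries to fill.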

7. Deploy and Automate Enterprise-Wide

Once you deploy, it’s best to go beyond the data analyst or data scientist.

What we mean by that is: always, always think about how you can distribute predictions and actionable insights throughout the enterprise. It’s where the data is and when it’s available that makes it valuable, not the fact that it exists. You don’t want to be the one sitting in the ivory tower, occasionally sprinkling insights. You want to be everywhere, with everyone asking for more insights—in short, you want to make sure you’re indispensable and extremely valuable.

Given that we all only have so much time, it’s easiest if you can automate this. Create dashboards. Incorporate these insights into enterprise applications. See if you can become a part of customer touch points, like an ATM recognizing that a customer regularly withdraws $100 every Friday night and prefers $500 after every payday.


Here are the core ingredients of good machine learning. You need good data, or you’re nowhere. You need to put it somewhere, like a database or object storage. You need deep knowledge of the data and what to do with it, whether that’s creating new derived variables or choosing the right algorithms to make use of them. Then you need to actually put them to work, get great insights, and spread those insights across the organization.

The hardest part of this is launching your machine learning project. We hope that by creating this article, we’ve helped you out with the steps to success. If you have any other questions or you’d like to see our machine learning software, feel free to contact us.

You can also refer back to some of the articles we’ve created on machine learning best practices and challenges concerning that. Or, download your free ebook, "Demystifying Machine Learning."


Announcing PeopleSoft Cloud Manager Support for Oracle Cloud Infrastructure

Fri, 2018-05-18 19:45

Oracle released PeopleSoft Cloud Manager in 2017 featuring in-depth automation to help accelerate adoption of Oracle Cloud (Classic) as an efficient deployment platform for PeopleSoft customers. With the excitement generated around Oracle Cloud Infrastructure (OCI)--a cloud designed for the enterprise customer--several customers and partners have been looking forward to taking advantage of the enhanced OCI with PeopleSoft Cloud Manager.  Oracle is pleased to announce Cloud Manager’s support for OCI beginning with today’s release of PeopleSoft Cloud Manager Version 6.

So, what is new and exciting in PeopleSoft Cloud Manager Version 6?  For the first time, there are two images provided: one for OCI, and the other for OCI Classic.  The Cloud Manager Image 6 for OCI supports a number of OCI features, including Regions, Virtual Cloud Networks, Subnets, Compute and DB System platforms.  With this image, instances will be provisioned on VM shapes.  Customers can lift and shift PeopleSoft environments from on-premises to OCI using the same approach they used with OCI Classic.

For PeopleSoft Cloud Manager on OCI Classic, we have enabled support for the lift and shift of on-premises databases encrypted with Oracle Transparent Data Encryption (TDE).  TDE offers another level of data security that customers are looking for as data is migrated to the cloud.  A ‘Clone to template’ option is also available for encrypted databases. 

The lift utility requires a few parameters for TDE so that the encrypted database may be packaged and lifted to the cloud.

During the shift process, the same parameters are required to deploy the lifted database.

Customers have also requested an enhancement to support non-Unicode databases for PeopleSoft environments.  PeopleSoft Cloud Manager Version 6 supports lift and shift of environments that use non-Unicode databases.  Unlike with image 5, converting the on-premises database to Unicode is no longer required before using Cloud Manager’s Lift and Shift automation.

To get your hands on the new Cloud Manager images, go to the Oracle Marketplace and look for either the OCI-Classic image or the OCI image…or try both!   Be sure to review the documentation and additional important information mentioned in the Marketplace listings.

We are excited to combine the automation of provisioning and maintenance that PeopleSoft Cloud Manager provides with the robust benefits of Oracle Cloud Infrastructure.  Combining support for OCI with the additional features of non-Unicode databases and TDE-encrypted databases, we expect all customers to benefit from the latest Cloud Manager image, using whichever Oracle Cloud is right for them.

Stay tuned for additional information and more Cloud Manager features.  Now, off to the next image!


Emerging Tech Helps Progressive Companies Deliver Exceptional CX

Fri, 2018-05-18 19:18

It’s no secret that the art of delivering exceptional service to customers—whether they’re consumers or business buyers—is undergoing dramatic change. Customers routinely expect highly personalized experiences across all touchpoints, from marketing and sales to service and support. I call each of these engagements a moment of truth—because leaving customers feeling satisfied and valued at each touchpoint will have a direct bearing on their loyalty and future spending decisions.

This is why customer experience (CX) has become a strategic business imperative for modern companies. Organizations that provide effective, well-integrated CX across the entire customer journey achieved compound annual growth rates of 17%, versus the 3% growth rates logged by their peers who provided less-effective customer experiences, according to Forrester’s 2017 “Customer Experience Index.”

Fortunately, it’s becoming easier to enter the CX winner’s circle. AI, machine learning, IoT, behavioral analytics, and other innovations are helping progressive companies capitalize on internal and third-party data to deliver highly personalized communications, promotional offers, and service engagements.

How can companies fully leverage today’s tools to support exceptional CX? If they haven’t already done so, companies should start evolving away from cloud 1.0 infrastructures, where an amalgam of best-of-breed services runs various business units. These standalone cloud platforms might have initially provided quick on-ramps to modern capabilities, but now, many companies are paying a price for that expediency. Siloed data and workflows hinder the smooth sharing of customer information among departments. This hurts CX when a consumer who just purchased a high-end digital camera at a retail outlet, for example, webchats with that same company’s service department about a problem, and the service team has no idea this is a premium customer.

In contrast, cloud 2.0 is focused on achieving a holistic view of customers—thanks to simplified, well-integrated services that support each phase of the customer journey. Eliminating information silos benefits companies by giving employees all the information they need to provide a tailored experience for every customer.

Achieving modern CX requires the right vendor partnerships. That starts with evaluating cloud services according to how complete, integrated, and extensible the CX platform is for supporting the entire customer journey. One option is the Oracle Customer Experience Cloud (Oracle CX Cloud) suite, an integrated set of applications for the entire customer lifecycle. It’s complemented by native AI capabilities and Oracle Data Cloud, the world’s largest third-party data marketplace of consumer and business information, which manages anonymized information from more than a billion business and 5 billion consumer identifiers. This means that business leaders, besides understanding customers based on their direct interactions, can use Oracle Data Cloud for insights into social, web surfing, and buying habits at third-party sites and retailers and then apply AI to find profitable synergies.

As new disruptive technologies come to the market—whether that’s the mainstreaming of IoT or drones for business—companies will be under constant pressure to integrate these new capabilities to improve their CX strategies. Modern, integrated cloud services designed for CX don’t support just today’s innovations. With the right cloud choices, companies can continually evolve to meet tomorrow’s CX challenges.

(Photo of Des Cahill by Bob Adler, The Verbatim Agency)