Tuesday, December 3, 2024

How to Implement Regular and Context Sensitive Descriptive Flexfields (DFFs) in Oracle Fusion Cloud

Descriptive Flexfields (DFFs) in Oracle Fusion Cloud are a powerful feature that allows users to extend ERP Cloud by adding custom fields to capture additional information without altering the underlying application framework. They are designed to be flexible and can be configured without programming changes.

Key Features of DFFs in Fusion Cloud:

Customizability: Add fields to capture additional data relevant to your business processes.

Flexibility: Enable or disable fields based on specific criteria.

Non-Technical Approach: Configure fields through the application’s user interface without writing any code.

Context Sensitivity: Create context-sensitive fields that display based on specific conditions.

Segmentation: Define multiple fields within a single DFF.


Let's see the steps to configure DFF in Oracle ERP Cloud.

- Navigate to Setup and Maintenance



- Search for 'Manage Descriptive Flexfields'


- Let's take an example of Invoices. Search for Invoices.




- Click on Edit

- Under the Global Segments section, click the + sign to configure a new field



- Let's create a new field using ATTRIBUTE1 and name it My Description. I'm using the 30 Characters value set as I want it to be free text.

Let's provide relevant values for Prompt and Display Type.

Check the BI Enabled checkbox if you wish to make the field available in Business Intelligence reports.




- Save and Close

- Let's add two more fields by following the above process -

My Date:



DS Comments:



- By now, our Global Segments should look like this -



- At this point, we have configured the regular DFFs using Global Segments for the Invoices form.

Configuring Context Sensitive DFFs in Oracle Fusion Cloud:

A context-sensitive DFF lets different segments reuse the same underlying database columns, depending on the context.

For example, we can define a Location context that uses the following attributes:

ATTRIBUTE1 column for city

ATTRIBUTE2 column for state

ATTRIBUTE3 column for zip

We can also define a Contact context that uses the same columns for other attributes:

ATTRIBUTE1 column for email

ATTRIBUTE2 column for phone

ATTRIBUTE3 column for fax
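
Behind the scenes, each segment maps to one of the generic ATTRIBUTE columns on the base table, and the selected context value is stored in the ATTRIBUTE_CATEGORY column. As a rough sketch of how the Location/Contact example above would land in the data (assuming the standard AP_INVOICES_ALL attribute columns, for instance when queried from a BI Publisher data model; this is illustrative only):

SQL Query:

SELECT invoice_num
      ,attribute_category   -- stores the selected context, e.g. 'Location' or 'Contact'
      ,attribute1           -- City when context = Location, Email when context = Contact
      ,attribute2           -- State when context = Location, Phone when context = Contact
      ,attribute3           -- Zip when context = Location, Fax when context = Contact
FROM ap_invoices_all
WHERE attribute_category IS NOT NULL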


Let's see how to configure Context Sensitive DFFs in Oracle Fusion

- Click on Manage Contexts button


- Click + Icon

- Create a new Context named My Group 1

- Check the Enabled checkbox



- Similarly, create another context named My Group 2



- Now, let's create three Segments under My Group 1 Context as shown below:



- And similarly, create two different Segments under My Group 2 Context as shown below:


- At this moment, we have created two contexts and configured different Segments under each context.


- Now, we need to deploy our DFF.


- Click on Deploy Flexfield button



- Once successfully deployed, click OK to close the popup



- Now, it's time to verify how our DFF segments appear on the Invoices screen

- Navigate to Payables and Create a new Invoice

- Click Show More and navigate to Additional Information tab




- We can see our Global Segments as shown below -



- Now let's navigate to My Group drop-down. This is our Context Sensitive DFF.

- Let's select My Group 1, and we will see the segments pertaining to this specific context are shown: 



- Let's select context My Group 2 and we will see the segments pertaining to this context are shown:




Conclusion:

This is how we can configure regular (Global) as well as Context Sensitive DFFs in Oracle Fusion. DFFs let us capture additional information in ERP without any technical customization or coding. With Context Sensitive DFFs, we can reuse the same set of ATTRIBUTE columns across multiple contexts, making it easy to capture a variety of additional business information.



Monday, September 23, 2024

How to Configure Spatial Geometry Objects in Geolocation Maps in Oracle APEX

In a previous post, we saw how to implement Geolocation Maps in Oracle APEX. In many use cases, you may need to draw certain features using coordinate and geometry information. Spatial Geometry objects can be used in such cases to make the geoprocessing easier.


Spatial geometry objects are drawn on the background map based on data in a table or returned by our query. APEX Maps support a rich set of built-in marker icons and shapes. Below are the spatial geometry objects supported in the APEX Map region:


Points - Points display as markers, e.g. the user's current location or any specific point on the map.

Lines - Lines represent features like roads, or simply a line drawn between a set of points.

Polygons - Polygons represent areas like states or countries, or can be drawn to highlight any other area on the map.

Heat Map - Heat Maps are used to visualize the point density.

Extruded Polygons - These are displayed as three-dimensional, extruded Polygons. The height of the 3D object is based on a Numeric column value returned by our query.


Spatial geometry objects can be based on below sources:

Geometry Column - Supported datatypes include SDO_GEOMETRY, VARCHAR2, or CLOB. VARCHAR2 and CLOB columns must contain geometry information in GeoJSON format.

Numeric Columns - Columns representing longitude and latitude values; this option applies only to Point and Heat Map objects (see the sketch below).
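
For example, a single source table could hold both kinds of columns: numeric Latitude/Longitude columns (usable by Point and Heat Map layers) and a native SDO_GEOMETRY column (usable by any layer type). A minimal sketch follows, with the table and column names invented purely for illustration:

SQL:

CREATE TABLE store_locations (
  store_name  VARCHAR2(100),
  latitude    NUMBER,        -- numeric coordinate columns for Point / Heat Map layers
  longitude   NUMBER,
  geom        SDO_GEOMETRY   -- geometry column usable by any layer type
);

-- A 2D point in WGS 84 (SRID 4326); note SDO_POINT_TYPE takes longitude (X) first
INSERT INTO store_locations (store_name, latitude, longitude, geom)
VALUES ('Sample Store',
        40.79261211884605,
        -73.95252971760179,
        SDO_GEOMETRY(2001, 4326,
                     SDO_POINT_TYPE(-73.95252971760179, 40.79261211884605, NULL),
                     NULL, NULL));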


Let's take a look at each of these Spatial Geometry Objects and how we can use them in the APEX Map region.



1. Points:

- Let's create a Layer in our Map

Layer Type: Points

As we learned above, a Points layer can be based on numeric values, one for Latitude and the other for Longitude. So let's use a simple query with sample values indicating a specific point on the map.


SQL Query:

SELECT TO_NUMBER('40.79261211884605') P_LAT 
      ,TO_NUMBER('-73.95252971760179') P_LONG
FROM DUAL




- Let's specify fa-location-arrow in Icon CSS Classes under the Point Objects section.




- Run the App.


- As we can see, the Point is denoted by an arrow at the exact location given by the Latitude and Longitude.






2. Lines:

- Let's create a new Layer in our Map

Layer Type: Lines

The Lines layer needs GeoJSON as its source, so let's build a query that returns data in GeoJSON format.
This will consist of two points on the map (from and to) with their corresponding Longitude and Latitude values.


SQL Query:

SELECT 
'
{
"type": "LineString",
"coordinates": [
[
-73.95252971760179,
40.79261211884605
],
[
-73.95345327758079,
40.791114880267148
]
]
}
' geojson_val
FROM DUAL
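
As a side note, hand-writing the GeoJSON string is fine for a fixed demo, but if your geometries are already stored in an SDO_GEOMETRY column, recent database versions can also generate the GeoJSON for you using SDO_UTIL.TO_GEOJSON. A sketch, assuming a hypothetical ROADS table with a GEOM column:

SQL Query:

-- TO_GEOJSON returns the geometry as a GeoJSON CLOB
SELECT SDO_UTIL.TO_GEOJSON(geom) geojson_val
FROM roads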




- Let's specify the line properties:

Stroke Style: Dotted

Stroke Color: Red

Stroke Width: 7





- Run the App.


- As we can see, a dotted red Line is drawn over our map based on our coordinates.

We can also see a new checkbox has been added on the map, providing controls for the Line layer.








3. Polygons:

- Let's create a new Layer in our Map

Layer Type: Polygons

The Polygons layer needs GeoJSON as its source, so let's build a query that returns data in GeoJSON format.
This will consist of a number of points on the map (forming a polygonal shape) with their corresponding Longitude and Latitude values. I'm forming the Polygon using 5 points on the map; note that a GeoJSON polygon ring must be closed, so the first coordinate pair is repeated at the end, giving 6 pairs in total.


SQL Query:

SELECT 
'
{
"type": "Polygon",
"coordinates": [
[
[
-73.9684379656287,
40.78569943754278
],
[
-73.9629016692047,
40.78849384000406
],
[
-73.95766551865343,
40.789760807246765
],
[
-73.95895386999608,
40.783522052350584
]
,
[
-73.96569143411632,
40.7829049598581
]
,
[
-73.9684379656287,
40.78569943754278
]
]
]
}
' geojson_val
FROM DUAL







- Let's specify the Appearance properties:

Fill Color: Grey

Fill Opacity: 0.5

Stroke Color: Red







- Run the App.


- As we can see, a semi-transparent Polygon with a red border is drawn over our map based on our coordinates.

We can also see a new checkbox has been added on the map, providing controls for the Polygon layer.






4. Extruded Polygons:

- Let's create a new Layer in our Map

Layer Type: Extruded Polygons

The Extruded Polygons layer also needs GeoJSON as its source, so let's build a query that returns data in GeoJSON format.
This will consist of a number of points on the map (forming a polygonal shape) with their corresponding Longitude and Latitude values. I'm forming the Polygon using 4 points on the map to show a building in 3D.

Here, we can see the query also has another column, E_VAL, with the value 70. This is the Extrusion Value for this layer, i.e. the value that determines the height of the extruded polygon.


SQL Query:

SELECT 
'
{
"type": "Polygon",
"coordinates": [
[
[
-73.95243322590251,
40.79222980941566
],
[
-73.95211869237345,
40.79209869069768
],
[
-73.95250182333042,
40.79162315684402
],
[
-73.95284231859179,
40.791672689498604
]
,
[
-73.95243322590251,
40.79222980941566
]
]
]
}
' geojson_val
,70 E_VAL
FROM DUAL






- Let's specify the Appearance properties:

Fill Color: Lavender

Extrusion Value Column: E_VAL (This is the column returned by our query)

Unit: Meter

Fill Opacity: 0.5







- Run the App.


- As we can see, a semi-transparent lavender Polygon is drawn over a building based on our coordinates.

We can also see a new checkbox has been added on the map, providing controls for the Extruded Polygon layer.






This is how we can leverage various Spatial Geometry Objects in the Oracle APEX Map region to draw a variety of objects over our Geolocation Maps.






Friday, August 9, 2024

Understanding the Data Flow Process in Oracle Cloud Infrastructure (OCI) Data Integration

In today's data-driven world, businesses need robust solutions to manage, process, and analyze vast amounts of data efficiently. Oracle Cloud Infrastructure (OCI) Data Integration is a powerful service designed to address these needs. With OCI Data Integration (OCI-DI), organizations can seamlessly integrate data across various sources, transform it into meaningful insights, and drive informed decision-making.

OCI-DI offers a streamlined approach to Extracting, Transforming and Loading data across various sources. This involves several steps, each designed to ensure data is accurately extracted, transformed, and loaded into the desired destination.

Let's take a look into the data flow process in OCI-DI.


1. Data Extraction

The data flow process begins with the extraction of data from various sources. This step involves identifying and accessing the relevant data sources, which could include:

- Databases: Oracle databases, third-party databases (such as MySQL, PostgreSQL, SQL Server, Snowflake), and other on-premises or cloud databases.

- SaaS Applications: Data from various Software as a Service (SaaS) applications, such as Oracle Fusion Cloud (BICC and BI Server), PeopleSoft, Salesforce, and others.

- File Storage: Structured or unstructured data stored in files within OCI Object Storage, HDFS, Hive as well as Amazon S3.

OCI-DI offers pre-built connectors that simplify the extraction process, allowing users to seamlessly connect to different data sources and retrieve the necessary data.


2. Data Transformation

Once data is extracted, it undergoes transformation to ensure it is in the correct format and structure for analysis. This stage is crucial for cleansing, enriching, and converting raw data into meaningful insights. Below are the key transformation activities in OCI-DI:

- Data Cleansing: Removing duplicates, applying filters, correcting errors, and ensuring consistency in the data.

- Data Enrichment: Adding additional information or context to the data to make it more useful for analysis.

- Data Aggregation: Summarizing data to generate metrics or insights, such as calculating totals, averages, or other aggregate values.

- Data Mapping: Converting data from one format to another, aligning it with the schema and requirements of the target system.

OCI-DI provides a rich set of transformation tools and SQL-based expressions to perform complex transformations efficiently.
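
These transformations are configured visually through operators in the data flow designer, but the logic maps closely to plain SQL. Purely as an illustration (the table and column names are invented, and this is not OCI-DI's own expression syntax), a cleansing-plus-aggregation step might boil down to something like:

-- Illustrative only: deduplicate and filter, then aggregate per customer
SELECT customer_id
      ,COUNT(*)         order_count
      ,SUM(order_total) total_revenue
      ,AVG(order_total) avg_order_value
FROM (SELECT DISTINCT order_id, customer_id, order_total   -- cleansing: remove duplicates
      FROM src_orders
      WHERE order_total IS NOT NULL)                        -- cleansing: drop incomplete rows
GROUP BY customer_id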


3. Data Loading

The final step in the data flow process is loading the transformed data into the target destination. This could be a data warehouse, data lake, or another storage solution where data can be analyzed and used for decision-making. Below are some commonly used target destinations in OCI-DI:

- OCI Data Warehouse: For storing and analyzing structured data using Oracle Autonomous Data Warehouse (ADW).

- OCI Data Lake: For storing large volumes of structured, semi-structured, and unstructured data using OCI Data Lake.

- Any other on-prem Oracle Database.

- External Systems: Other cloud storage solutions, third-party databases, or on-premises systems.

OCI-DI ensures that data is loaded efficiently and accurately, maintaining data integrity throughout the process.


4. Scheduling and Orchestration

To ensure that data flows are executed regularly and efficiently, OCI-DI provides tools for scheduling and orchestration. This involves:

- Task Scheduling: Setting up schedules for Data Flow Integration Tasks to run at specified times or intervals, ensuring data is updated and synchronized regularly.

- Orchestration using Pipelines: Building Pipelines to coordinate multiple data flows and processes to run in a specific sequence, ensuring dependencies are managed, and workflows are executed smoothly.


5. Monitoring and Management

Effective monitoring and management are crucial for maintaining the reliability and performance of data flows. OCI-DI offers comprehensive monitoring and management capabilities, including:

- Real-Time Monitoring: Tracking the status and performance of data flows in real time, allowing users to identify and address issues promptly.

- Alerts and Notifications: Setting up alerts to notify users of any failures, delays, or anomalies in the data flows, ensuring quick resolution of issues.

- Logging and Auditing: Maintaining detailed logs and audit trails of data flow activities for troubleshooting, compliance, and governance purposes.


Main Components of OCI-DI service:

1. Data Assets:

Data Assets represent the data sources that we want to use in Data Integration data flows, tasks, and pipelines. They help us connect to various systems such as BICC, the Fusion BI Server, Object Storage, Oracle Cloud databases such as ADW, as well as any external on-prem Oracle databases.

Data Assets for RDBMS:



Data Assets for File and Big Data:




Data Assets for Application:




Data Assets for Others:





2. Data Flows:

A data flow is a visual program representing the flow of data from source data assets, such as a database or flat file, to target data assets, such as a data lake or data warehouse.

Sample Data Flow:




3. Tasks:

A task is a design-time resource that specifies a set of actions to perform on data. Tasks can be of the following types:

- Data Loader: Lets us load diverse data into data lakes and data warehouses.

- Integration: Lets us take a data integration flow design and prepare it for running.

- OCI Data Flow: Lets us run an existing OCI Data Flow application from within OCI-DI.

- Pipeline: Lets us run a Pipeline.

- SQL: Lets us run custom SQL from within a pipeline.

- REST: Lets us run a REST API endpoint in a pipeline.




Sample Integration Task:



Task Scheduling:

Once a task is published to an Application in OCI-DI, we can run the task manually from the application or create a task schedule to automate the runs.

Sample Task Schedule:





4. Pipelines:

A pipeline is a design-time resource in Data Integration for connecting tasks and activities in one or more sequences, or in parallel, from start to finish to orchestrate data processing. We can invoke various Data Flow tasks, Data Loader tasks, or even SQL and REST tasks in a Pipeline to meet the orchestration and data integration requirements.

Sample Pipeline:





Conclusion

The Extract-Transform-Load process in Oracle Cloud Infrastructure (OCI) Data Integration is designed to be efficient, scalable, and robust. OCI-DI seamlessly integrates data from various sources, transforms it into meaningful insights, and loads it into the appropriate destination, empowering organizations to make data-driven decisions. Whether you are dealing with large volumes of data or complex data integration requirements, OCI-DI offers the tools and capabilities needed to streamline data workflows and unlock the full potential of your data.



Tuesday, July 23, 2024

Implementing Geolocation Maps in Oracle APEX

Geolocation Maps have become a crucial part of many applications built in Oracle APEX. Prior to release 21.1, developers had to rely on third-party geolocation map providers and obtain API keys and third-party plug-ins to integrate Geolocation Maps in Oracle APEX.

With release 21.1, Oracle introduced a native Map region in APEX. It eliminates the need for third-party plug-ins and API keys, and best of all, it's free.

The Map component is based on the Oracle eLocation Service. More details can be found at https://maps.oracle.com.
The Map region supports vector map layers based on OpenStreetMap Position, OpenStreetMap Bright, and OpenStreetMap DarkMatter.

Let's see how to implement this functionality in our Apex App.

- Create a new App in Oracle Apex

- Create a new page

- Create a page item to store GeoLocation details




- Create a Region and set type to Map






- Set the SQL Query in the source of the Map region to:


SELECT :P4_GEOLOC GEOJSON_COL FROM DUAL

Set Page Items to Submit to P4_GEOLOC




- Now, let's see how we can obtain the current location of the user

- Navigate to Dynamic Actions

- Create a DA at Page Load level



- Set Action to 'Get Current Position'



- Set Return Type to GeoJSON
- Set Item to P4_LATLONG



This will enable our App to acquire the device's current location and store the returned geolocation data in the item P4_LATLONG.


About GeoJSON:
GeoJSON is a geospatial data interchange format based on JavaScript Object Notation (JSON). It defines several types of JSON objects and the manner in which they are combined to represent data about geographic features, their properties, and their spatial extents. Broadly, it accommodates 7 different kinds of geometry objects: "Point", "LineString", "Polygon", "MultiPoint", "MultiLineString", "MultiPolygon", and "GeometryCollection".

In the above example, it will return data as shown below:

{
    "latitude":20.7195718,
    "longitude":-103.4363326,
    "altitude":null,
    "accuracy":16.452,
    "altitudeAccuracy":null,
    "heading":null,
    "speed":null
}


However, the Map region in Oracle APEX requires the data in the GeoJSON geometry format below:

{
    "type": "Point",
    "coordinates": [
        Longitude,
        Latitude
    ]
}

- So we need to transform the data returned by the Get Current Position action into this format.

- Let's create another Dynamic Action named Transform with the Action set to Execute Server-side Code and embed the below PL/SQL code in it. We will use the json_table function to transform the data into the desired format.

Set Items to submit as P4_LATLONG
Set Items to Return as P4_GEOLOC

P4_GEOLOC is the item we originally used in the Source of Map region



Code:

declare
   lv_geojson varchar2(4000);
begin
   -- P4_LATLONG holds the JSON returned by the Get Current Position action.
   -- Extract the longitude and latitude values and build the GeoJSON Point
   -- geometry string expected by the Map region.
   with device_json as (
           select :P4_LATLONG json_res
           from dual )
   select '{"type": "Point", "coordinates": ['||json_data.longitude||', '||json_data.latitude||']}'
     into lv_geojson
     from device_json,
          json_table(json_res, '$[*]'
             columns (longitude varchar2(100) path '$.longitude',
                      latitude  varchar2(100) path '$.latitude')
          ) json_data;

   -- P4_GEOLOC is the item used as the Source of the Map region
   :P4_GEOLOC := lv_geojson;

end;



- Let's add another Dynamic Action to refresh the map



- Let's run the App and see the result.

- Here, we can see the Map region shows my current location, somewhere in the Seattle area.




- Let's zoom into the map, and we'll see my exact location, which is near the famous Starbucks Pike Place Market location.




- In the above examples, you will notice the map footer says OpenStreetMap.

This is because I've selected OpenStreetMap Bright as the Standard in the Map region Attributes.




You can experiment with all these options and select the one that best suits your use case.

Here's the output using Oracle World Map standard





Using Latitude/Longitude instead of GeoJSON:

We saw how we can obtain the geolocation data in GeoJSON format using the Get Current Position action.

There's another return type supported by this action: Latitude and Longitude values.
If we only need these two and do not require other values such as Altitude, Speed, etc., then we can use this method. Let's take a look at that.

- Let's create two separate page items, one to store Latitude and the other Longitude, and make them hidden.




- Now, let's navigate back to our Dynamic Action and let's change the Return Type to Latitude and Longitude






- Now, assign P4_LAT and P4_LONG page items as below.



This will return the Latitude and Longitude values of the user's current location in these page items.

- Now, let's navigate to Map Region

- Set the Source SQL Query as below. This will simply use the Latitude and Longitude values stored in the hidden page items.

Note: We need to convert these to NUMBER format before using them in the Map region.


SELECT TO_NUMBER(:P4_LAT) P_LAT 
      ,TO_NUMBER(:P4_LONG) P_LONG
FROM DUAL




- Set Page Items to Submit to P4_LAT,P4_LONG

- Navigate to Column Mapping section under Map Layer

- Set Geometry Column Data Type to Longitude/Latitude

- Set the corresponding page item values




- Now, let's assume the user is near the Museum of the City of New York while accessing the App; the Map region will precisely show their location as below -




