Feed aggregator

Helvetia used AWS SCT & DMS to migrate to AWS RDS for PostgreSQL

Yann Neuhaus - Thu, 2022-05-19 04:32

One of our long-term customers, Helvetia, successfully migrated its on-premises Oracle databases to AWS, not only because of the licensing costs, but more importantly to deploy faster, innovate faster, and use a state-of-the-art open source database system.

When you plan such a project, you need to know which tools you want to use and what the target architecture should look like. There are several options to choose from, but in the end Helvetia decided to use the AWS-native services AWS DMS and AWS RDS for PostgreSQL.

AWS DMS gives you the option to initially populate the target instance from the source and, right afterwards, logically replicate ongoing changes from the source to the target. However, before you can do that, the schema needs to be ready in the target. To prepare this, there is AWS SCT. This is not an AWS service, but a free tool you can use to convert a schema from one database system to another. If you want to go from Oracle to PostgreSQL, this tool also performs an automatic conversion from Oracle's PL/SQL to PostgreSQL's PL/pgSQL. Although this tool does a great job, you have to be very careful with the result and invest a good amount of time in testing. Autonomous transactions, for example, do not exist in PostgreSQL, and AWS SCT implements a workaround using database links. This can be fine if you use the feature rarely (because it needs to establish a new connection every time), but if you rely on it heavily, you'd better re-implement it in a way that is native to PostgreSQL.

Another area you need to pay attention to is data types. PostgreSQL comes with many of them, and a NUMBER in Oracle can map to several things in PostgreSQL: it could become an integer or a numeric, and depending on what you go for, this has space and performance impacts in PostgreSQL. PostgreSQL also comes with a native boolean data type, while in Oracle a boolean is usually implemented as a character or a numeric value. Do you want to keep it that way, or do you want to convert to a boolean? Converting means that you also need to adjust the business logic in the database.
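To illustrate, here is a minimal sketch of such a type-mapping decision in Python. The precision thresholds and the Y/N-flag convention below are illustrative assumptions for this post, not AWS SCT's actual rules:

```python
def map_oracle_type(data_type, precision=None, scale=None, is_yn_flag=False):
    """Suggest a PostgreSQL type for an Oracle column (illustrative rules only)."""
    if data_type == "NUMBER":
        if scale in (0, None) and precision is not None:
            # Scale-0 NUMBERs with a bounded precision fit native integer
            # types, which are smaller and faster than numeric in PostgreSQL.
            if precision <= 9:
                return "integer"
            if precision <= 18:
                return "bigint"
        return "numeric"  # unconstrained or fractional NUMBER
    if data_type in ("CHAR", "VARCHAR2") and is_yn_flag:
        # An Oracle 'Y'/'N' flag can become a real boolean -- but then the
        # business logic comparing against 'Y' must be adjusted as well.
        return "boolean"
    return {"VARCHAR2": "text", "DATE": "timestamp"}.get(data_type, data_type.lower())

print(map_oracle_type("NUMBER", precision=8, scale=0))   # integer
print(map_oracle_type("NUMBER", precision=10, scale=2))  # numeric
print(map_oracle_type("CHAR", is_yn_flag=True))          # boolean
```

Whatever rules you settle on, apply them consistently across the whole schema, because the choice affects storage size, index behaviour and the converted PL/pgSQL code.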

Another issue took quite some time to solve: the very simplified test case attached to the initial email showed massive performance drops in PostgreSQL compared to Oracle. The reason is that Oracle's PL/SQL is a compiled language, while PostgreSQL's PL/pgSQL is interpreted. If you have a case that more or less matches what is described in the thread linked above, you need to re-write it. The same applies when you have commits or rollbacks in PL/SQL functions: PostgreSQL does not allow you to commit or roll back inside a function, so you need to use procedures for that.

These are just a few hints of what might come up along the way when migrating to AWS RDS for PostgreSQL. Once you have solved all of this, the migration can be really smooth and will most probably be a success. Here are some posts that describe how to set this up using an Oracle sample schema as the source:

If you follow that, you should have enough knowledge to get started with your journey to AWS RDS.

The article Helvetia used AWS SCT & DMS to migrate to AWS RDS for PostgreSQL first appeared on Blog dbi services.

Announcement: Dates Confirmed For Upcoming Webinars (“Here Today, Gone Tomorrow”)

Richard Foote - Thu, 2022-05-19 01:53
As promised last week, I have now finalised the dates for my upcoming webinars. They will be run as follows: “Oracle Indexing Internals“ Webinar: 18-22 July 2022 (between 09:00 GMT and 13:00 GMT daily) “Oracle Performance Diagnostics and Tuning“ Webinar: 8-11 August 2022 (between 09:00 GMT and 13:00 GMT daily) I’ll detail costings and how […]
Categories: DBA Blogs



Maximum number of Autonomous Databases on an ExaCC Full Rack

Tom Kyte - Tue, 2022-05-17 12:26
How many autonomous databases can we have on an ExaCC Full Rack?
Categories: DBA Blogs

Number of cores before requiring RAC licensing for BYOL

Tom Kyte - Tue, 2022-05-17 12:26
Is the maximum number of cores before needing RAC licensing counted per cluster, or for the sum of all clusters?
Categories: DBA Blogs

Automatic Indexes: Automatically Rebuild Unusable Indexes Part III (“Waiting For The Man”)

Richard Foote - Tue, 2022-05-17 01:43
I’ve previously discussed how Automatic Indexing (AI) will not only create missing indexes, but will also rebuild unusable indexes, be it a Global or Local index. However, all my previous examples have been with Automatic Indexes. How does AI handle unusable indexes in which the indexes were manually created? In my first demo, I’ll start […]
Categories: DBA Blogs

Connecting to MySQL Database Service (MDS) via DBeaver

DBASolved - Mon, 2022-05-16 10:37

With every new service on any cloud platform, the need to make connections is essential. This is the case with […]

The post Connecting to MySQL Database Service (MDS) via DBeaver appeared first on DBASolved.

Categories: DBA Blogs

A quick way of generating Informatica PowerCenter Mappings from a template

Rittman Mead Consulting - Mon, 2022-05-16 04:52
Generating Informatica PowerCenter Content - the Options

In our blogs we have discussed the options for Oracle Data Integrator (ODI) content generation here and here. Our go-to method is to use the ODI Java SDK, which allows querying, manipulating and generating new ODI content.

Can we do the same with Informatica PowerCenter? Older PowerCenter versions had the Design API, which enabled browsing the repository and creating new content. However, I have never used it myself. My impression is that Oracle APIs are more accessible than Informatica APIs in terms of documentation, help available online, and availability for download and tryout.
If we want to browse the PowerCenter repository content, there is an easy way - query the repository database. But what about content generation? Who will be brave or foolish enough to insert records directly into a repository database!? Fortunately, there is a way, and a fairly easy one, if you don't mind doing a bit of Python scripting.

Generate PowerCenter Mappings - an Overview

Selective Informatica PC repository migrations are done via XML export and import - it is easy and mostly fool-proof. If we can generate XMLs for import, then we have found a way of auto-generating PowerCenter content. Informatica seems to support this approach by giving us nice, descriptive error messages if something is wrong with import XMLs. Only completely valid XMLs will import successfully. I have never managed to corrupt my Informatica repository with a dodgy XML import.

Let us look at an example - we need to extract a large number of OLTP tables to a Staging schema. The source and staging tables have very similar structures, except the staging tables have MD5 codes based on all non-key source fields to simplify change data capture (CDC) and also have the extract datetime.

  1. We start by creating a single mapping in Designer, test it, make sure we are 100% happy with it before proceeding further;
  2. We export the mapping in XML format and in the XML file we replace anything unique to the source and target table and their fields with placeholder tags: [[EXAMPLE_TAG]]. (See the XML template example further down.)
  3. Before we generate XMLs for all needed mappings, we need to import Source and Target table definitions from the databases. (We could, if we wanted, generate Source and Target XMLs ourselves, but PC Designer allows us to import tables in bulk, which is quicker and easier than generating the XMLs.)
  4. We export all Sources into a single XML file, e.g. sources.xml. Same with all the Targets - they go into targets.xml. (You can select multiple objects and export in a single XML in Repository Manager.) The Source XML file will serve as a driver for our Mapping generation - all Source tables in the sources.xml file will have a Mapping generated for them.
  5. We run a script that iterates through all source tables in the source XML, looks up its target in the targets XML and generates a mapping XML. (See the Python script example further down.) Note that both the Source and Target XML become part of the Mapping XML.
  6. We import the mapping XMLs. If we import manually via the Designer, we still save time in comparison to implementing the mappings in Designer one by one. But we can script the imports, thus getting both the generation and import done in minutes, by creating an XML Control File as described here.
Scripting Informatica PowerCenter Mapping generation

A further improvement to the above would be reusable Session generation. We can generate Sessions in the very same manner as we generate Mappings.

The Implementation

An example XML template for a simple Source-to-Staging mapping that includes Source, Source Qualifier, Expression and Target:

<?xml version="1.0" encoding="UTF-8"?>
<FOLDER NAME="Extract" GROUP="" OWNER="Developer" SHARED="NOTSHARED" DESCRIPTION="" PERMISSIONS="rwx---r--" UUID="55321111-2222-4929-9fdc-bd0dfw245cd3">
    [[SOURCE]]
    [[TARGET]]
    <!-- Source Qualifier attributes: -->
            <TABLEATTRIBUTE NAME ="Sql Query" VALUE =""/>
            <TABLEATTRIBUTE NAME ="User Defined Join" VALUE =""/>
            <TABLEATTRIBUTE NAME ="Source Filter" VALUE =""/>
            <TABLEATTRIBUTE NAME ="Number Of Sorted Ports" VALUE ="0"/>
            <TABLEATTRIBUTE NAME ="Tracing Level" VALUE ="Normal"/>
            <TABLEATTRIBUTE NAME ="Select Distinct" VALUE ="NO"/>
            <TABLEATTRIBUTE NAME ="Is Partitionable" VALUE ="NO"/>
            <TABLEATTRIBUTE NAME ="Pre SQL" VALUE =""/>
            <TABLEATTRIBUTE NAME ="Post SQL" VALUE =""/>
            <TABLEATTRIBUTE NAME ="Output is deterministic" VALUE ="NO"/>
            <TABLEATTRIBUTE NAME ="Output is repeatable" VALUE ="Never"/>
    <!-- Expression ([[MD5_EXPRESSION]]), connectors ([[SRC_2_SQ_CONNECTORS]],
         [[SQ_2_EXP_CONNECTORS]]) and the Target instance omitted for brevity -->
</FOLDER>

Python script snippets for generating Mapping XMLs based on the above template:

  1. To translate database types to Informatica data types:
mapDataTypeDict = {
	"nvarchar": "nstring",
	"date": "date/time",
	"timestamp": "date/time",
	"number": "decimal",
	"bit": "nstring"
}

2. Set up a dictionary of tags:

xmlReplacer = {
	"[[SOURCE]]": "",
	"[[TARGET]]": "",
	"[[MAPPING_NAME]]": "",
	"[[MD5_EXPRESSION]]": "",
	"[[SRC_2_SQ_CONNECTORS]]": "",
	"[[SQ_2_EXP_CONNECTORS]]": ""
}

3. We use the Source tables we extracted in a single XML file as our driver for Mapping creation:

import xml.etree.ElementTree as ET

sourceXmlFilePath = '.\\sources.xml'

# go down the XML tree to individual Sources
sourceTree = ET.parse(sourceXmlFilePath)
sourcePowerMart = sourceTree.getroot()
sourceRepository = list(sourcePowerMart)[0]
sourceFolder = list(sourceRepository)[0]

for xmlSource in sourceFolder:
	# generate a Mapping for each Source;
	# we also need to go down to the Field level:
	for sourceField in xmlSource:
		pass  # field-level operations go here

4. Generate the tag values. This particular example is of a column-level tag: a column connector between the Source Qualifier and the Expression. A sketch of what this step can look like (the CONNECTOR element layout and the instance names here are assumptions, not taken from a real export):

sqToExpConnectors = []
for sourceField in xmlSource:
	fieldName = sourceField.get('NAME')
	# one CONNECTOR line per column, later joined into [[SQ_2_EXP_CONNECTORS]]
	sqToExpConnectors.append(
		'<CONNECTOR FROMFIELD="{0}" FROMINSTANCE="SQ_EXTRACT" FROMINSTANCETYPE="Source Qualifier" '
		'TOFIELD="{0}" TOINSTANCE="EXP_MD5" TOINSTANCETYPE="Expression"/>'.format(fieldName)
	)

5. We assign our tag values to the tag dictionary entries:

xmlReplacer["[[SQ_2_EXP_CONNECTORS]]"] = '\n'.join(sqToExpConnectors)

6. We replace the tags in the XML Template with the values from the dictionary:

for replaceTag in xmlReplacer.keys():
	mappingXml = mappingXml.replace(replaceTag, xmlReplacer[replaceTag])
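Putting these pieces together, here is a minimal, self-contained sketch of the whole generation loop. The XML in it is reduced to a bare skeleton, and the STG_ target naming convention and m_ mapping prefix are assumptions for illustration, not part of the approach described above:

```python
import xml.etree.ElementTree as ET

MAPPING_TEMPLATE = """<MAPPING NAME="[[MAPPING_NAME]]">
[[SOURCE]]
[[TARGET]]
</MAPPING>"""

def generate_mappings(sources_xml, targets_xml, template=MAPPING_TEMPLATE):
    """Generate one mapping XML per Source, looking up its Target by name."""
    # Repository Manager exports wrap objects in POWERMART/REPOSITORY/FOLDER.
    src_folder = ET.fromstring(sources_xml).find("./REPOSITORY/FOLDER")
    tgt_folder = ET.fromstring(targets_xml).find("./REPOSITORY/FOLDER")
    targets = {t.get("NAME"): t for t in tgt_folder}
    mappings = {}
    for source in src_folder:
        name = source.get("NAME")
        target = targets["STG_" + name]  # staging naming convention: assumption
        xml_replacer = {
            "[[MAPPING_NAME]]": "m_" + name,
            "[[SOURCE]]": ET.tostring(source, encoding="unicode"),
            "[[TARGET]]": ET.tostring(target, encoding="unicode"),
        }
        mapping_xml = template
        for tag, value in xml_replacer.items():
            mapping_xml = mapping_xml.replace(tag, value)
        mappings["m_" + name] = mapping_xml
    return mappings

sources = '<POWERMART><REPOSITORY><FOLDER><SOURCE NAME="CUSTOMER"/></FOLDER></REPOSITORY></POWERMART>'
targets = '<POWERMART><REPOSITORY><FOLDER><TARGET NAME="STG_CUSTOMER"/></FOLDER></REPOSITORY></POWERMART>'
print(generate_mappings(sources, targets)["m_CUSTOMER"])
```

In the real script each generated mapping XML is written to its own file and then imported, either manually via the Designer or scripted with a control file.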

If you are interested in finding out more about our approach to generating Informatica content, contact us.

Categories: BI & Warehousing

Data Annotation with SVG and JavaScript

Andrejus Baranovski - Mon, 2022-05-16 01:35
I explain how to build a simple data annotation tool with SVG and JavaScript in an HTML page. The sample code renders two boxes in SVG on top of a receipt image. You will learn how to select and switch between annotation boxes. Enjoy!


Maximum number of concurrent sessions in multi instance database

Tom Kyte - Sun, 2022-05-15 23:46
Hi, we have Oracle 12c with 2 instances. I know GV$LICENSE can give the maximum number of concurrent sessions since the start of each instance, but is there a way to get the maximum number of sessions accessing the database from both instances together? Syed
Categories: DBA Blogs

Index on XMLTYPE with XPATH Expression including a XPATH Function

Tom Kyte - Sun, 2022-05-15 23:46
Is there a way to create an index on an XMLType column for an XPath expression that includes an XPath function? Please note that the XMLType index creation fails on Oracle LiveSQL.
Categories: DBA Blogs

Cannot Upload git-upload-pack error while cloning Azure Git Repository

Tom Kyte - Sun, 2022-05-15 23:46
Hi,
Background and requirement: I am working for a firm that uses Oracle SQL Developer for data cleaning and manipulation of the data residing in the Oracle Database. We use Microsoft Azure for complete lifecycle management and work planning, so we decided to use an Azure-hosted cloud Git repository to host our code remotely and leverage its version control capabilities. We have a Git repository on Azure and are trying to clone it in Oracle SQL Developer.
Steps followed: 1. Go to the Team menu. 2. Hover over Git. 3. Select the Clone option. 4. After the Clone from Git wizard opens, enter the correct repository URL, username and password. 5. We work on a VPN, so I have set the corresponding proxy settings too; when testing the proxy, it gives a success message (so, no issue with the proxy settings). 6. Click Next to fetch the remote repository branches. An error appears at this stage.
Error that occurred: a popup with the title "Validation failed" and the content https://<remote repo url>/_git/<remote repo name>: cannot open git-upload-pack.
Troubleshooting methods tried: 1. A lot of troubleshooting guides online suggested setting sslVerify to false in the local Git config; did that, no gain. 2. Tried cloning my personal Git repository to test the Git integration in Oracle SQL Developer; it fetched the remote branches successfully, so the error only comes up when cloning an Azure repository. 3. Looked at almost all the solution links online, but most of them were for Eclipse; since both Eclipse and SQL Developer are Java-based applications, I tried those resolutions too, but most of them are again about setting SSL Verify to false.
At the end I have raised the issue here.
Hoping to find some help here. Thanks in advance.
Categories: DBA Blogs



FORCE_LOGGING in Autonomous Database

Tom Kyte - Fri, 2022-05-13 16:46
Is FORCE_LOGGING enabled at CDB level in ADB-S? I checked that FORCE_LOGGING was not enabled at the PDB level and the Tablespace level.
Categories: DBA Blogs

Find Circular References in UDTs

Tom Kyte - Fri, 2022-05-13 16:46
The latest Oracle docs have the following design tip: "Circular Dependencies Among Types: Avoid creating circular dependencies among types. In other words, do not create situations in which a method of type T returns a type T1, which has a method that returns a type T." https://docs.oracle.com/en/database/oracle/oracle-database/21/adobj/design-consideration-tips-and-techniques.html Attached is a link to LiveSQL that exhibits a very simple circular dependency that will likely have issues recompiling during a Data Pump import. Assuming we already have a large application that the compiler is having issues with, is there a query we can use to find instances where T1 references T2 and T2 references T1? We would also need to find them a few generations apart (T1 references T2, T2 references T3, T3 references T1). The reference may be either in an attribute (REF) or a subprogram (parameter or return type). This would allow us to find which types may need to be changed to be brought in line with the latest documentation. Thanks in advance for your help.
Categories: DBA Blogs

Select XMLQuery XML parsing error with ampersands

Tom Kyte - Fri, 2022-05-13 16:46
Hi Tom and team, I guess that this issue is related to the namespace, but as I don't know this area well, could you help me solve the error when running this SELECT, please? <code>with testTable as (
  select xmltype('<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
    <soap:Body>
      <ns5:MT_Consulta_pedidos_pagamento xmlns:ns2="urn:Cpy.com/Model/ConsultaPedidosDevolucao/v0" xmlns:ns3="urn:Cpy.com/Model/AtualizaStatusPagamento/v0" xmlns:ns4="urn:Cpy.com/Model/AtualizaItensDevolvidosCancelados/v0" xmlns:ns5="urn:Cpy.com/Model/ConsultaPedidosPagamento/v0">
        <codigo_empresa>&Empresa</codigo_empresa>
        <numero_pedido_venda>&Pedido</numero_pedido_venda>
        <codigo_loja>&Loja</codigo_loja>
        <numero_componente>&Componente</numero_componente>
      </ns5:MT_Consulta_pedidos_pagamento>
    </soap:Body>
  </soap:Envelope>') xml_val
  from dual
)
select xmlquery('/soap' passing xml_val returning content) as dados
from testTable;</code>
Categories: DBA Blogs

Patch Oracle GoldenGate Microservices using RESTful APIs

DBASolved - Fri, 2022-05-13 08:10

In 2017, Oracle introduced the world to Oracle GoldenGate Microservices through the release of Oracle GoldenGate 12c. Upon the […]

The post Patch Oracle GoldenGate Microservices using RESTful APIs appeared first on DBASolved.

Categories: DBA Blogs

How can we execute a SQL script file in a trigger and write the output of the execution to a log file?

Tom Kyte - Thu, 2022-05-12 22:26
How can we execute a SQL script file in a SQL trigger and write the output of the script execution to a log file? We are automating the execution of one of our SQL script files. We want to execute this script file once data is inserted into the table, so we want the script file execution in the trigger. Regards, Abhishek Bhargava
Categories: DBA Blogs

