Category Archives: Business

A new form of on-line data protection


In the last few years I have been active with data replication solutions in the Oracle realm, as you may know. This data replication field is one that has many angles, so there is something new to learn every day, and sometimes there are even really new possibilities!

Take heed…

The first and most familiar form of data replication is ‘physical data replication’, also known as ‘Standby Database’.
In this form of replication, both source and target database are binary identical. Changes are propagated by copying the archived redo logfiles from the source database to the environment where the standby database lives. Most often this is another server, preferably in another building in another town, far enough away not to be struck by the same havoc.

There are basically 3 ways to accomplish this:

  1. Use Oracle Data Guard (in Enterprise Edition Oracle database)
  2. Use Dbvisit Standby (in all Oracle database Editions)
  3. Write your own scripting (not recommended in any case)

The second and more emerging form of data replication is ‘Logical Data Replication’.
In this form of replication, there is no real relationship between the source and the target database, other than that the target database houses data coming from the source database. They can live on different systems, be of different database versions, run on different operating systems or even come from different vendors.
Data is harvested from the source database, converted and copied over to the target database / system. On the target system this data is applied in the native language of the target database.

There are a few ways to accomplish this, but every vendor uses basically the same technique. It is more a matter of pricing, really.

  1. Oracle Golden Gate (expensive, complex)
  2. Dell Shareplex (somewhat expensive)
  3. IBM Infosphere (ComPlex, expensive)
  4. Dbvisit Replicate (easy, affordable)

So, having discussed this, as this is not new, why this blogpost?

Well…

A Standby database is more or less closed. You can open it occasionally to query some data, but that interrupts the apply-process.
On-line data replication does what it says: you have an active target database, to which data is continuously added. This way you can, for example, query the same data on two sources to spread load.

The case I mean to discuss is the following:

“I have 10 source databases and I want one target database (ah, presto, on-line data replication) and I want to back up 5 tables from each source to the target database (again, on-line data replication, but wait, backup?) so I can easily copy back specific data to the source (eeeuhm, yes…) whenever a user messes up the source tables (aï…) and I want the target to be updated each day at 23:00 (so… okay!)”

This smacks of a somewhat hybrid approach!

We cannot do regular on-line data replication, for this is aimed at being real-time.
And we cannot leverage a Standby Database, since the data needs to be centralized in one target database and not in 10. Next to that, it would take some administration to open the standby database in read-only mode, take the copy, and close the database again.

Working with Dbvisit, we came up with “Pause Apply” and “Resume Apply”, which we combine to form “Delayed Apply”.
This delayed apply would neatly answer the question posed.

  • By “delaying” the application of changes to the data, we could make sure the requested tables are only updated from 23:00 on;
  • We can combine the 50 tables (10 databases x 5 tables) in one single target database, since it is a logical approach to the matter;
  • We can easily restore or copy back corrupted data, since both the source and the target database remain continuously open.
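To give an idea of the scheduling side of this, here is a minimal sketch. This is not actual Dbvisit Replicate syntax; the two shell scripts are placeholders for whatever pause/resume mechanism your replication tool exposes, and the times simply reflect the 23:00 window from the question above.

-- Minimal sketch (placeholder scripts, not Dbvisit syntax): two scheduler jobs that
-- resume the apply process at 23:00 and pause it again after the nightly window.
-- Note: external jobs may require OS credentials to be configured for the scheduler.
BEGIN
  DBMS_SCHEDULER.CREATE_JOB(
    job_name        => 'RESUME_APPLY_JOB',
    job_type        => 'EXECUTABLE',
    job_action      => '/home/oracle/scripts/resume_apply.sh',  -- placeholder
    repeat_interval => 'FREQ=DAILY; BYHOUR=23; BYMINUTE=0',
    enabled         => TRUE);

  DBMS_SCHEDULER.CREATE_JOB(
    job_name        => 'PAUSE_APPLY_JOB',
    job_type        => 'EXECUTABLE',
    job_action      => '/home/oracle/scripts/pause_apply.sh',   -- placeholder
    repeat_interval => 'FREQ=DAILY; BYHOUR=1; BYMINUTE=0',
    enabled         => TRUE);
END;
/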

Using Dbvisit Replicate, having this kind of protection for your “logical test-cases” (which is what this company needed the solution for) is really affordable.
It can help in dynamically and quickly resetting specific data-sets or test-cases while remaining much more flexible than creating scripts to reset a specific data-set or test-case! And, of course, there are many more ways to use this neat feature…


My WordPress site just disappeared

I was just thinking about re-checking something I wrote. That happens.

So I went to my blog-site… Just to find out it was gone…

GONE?

Yes, gone!

I got the message that this domain was reserved by my provider, TransIP. And that was not exactly what I was looking for.

The mystery was quickly resolved.
By checking out the ControlPanel at my TransIP account, I found I had received a message:

Dear customer,

Because of complaints of high load on our web hosting platform we have regrettably been forced to (temporarily) block your website jk-consult.nl. We have done this to ensure the stability of our servers.

WHAT!!

This high load is caused by the (automatic) posting of a high number of comments on your WordPress website. It is highly probable that this concerns unapproved comments.

In many cases this is a form of automated spam. This leads to a high load on our web hosting servers, which leads to performance problems for you and other clients on this same server.

Eeewww…

And this was followed by some tips on how to quiet down this spamming.

I have now installed these two plugins from WordPress:

https://wordpress.org/plugins/stop-spam-comments/
https://wordpress.org/plugins/spam-comments-cleaner/

And I just checked. Instead of the 8,000+ comments, there were now just 2!

Another problem solved.

Thank you Arjan van den Berg (I see these are not the first kudos you’ve received)

Printing directly with APEX

When looking for a print solution with APEX you will find .PDF

You will find a lot of .PDF

And .PDF is good. There is nothing wrong with .PDF. In fact, .PDF looks cool and you can do a lot of neat stuff with it. With toolkits like pl/pdf you can create .PDFs directly from PL/SQL.

But sometimes there is the need to be able to print directly.
For instance with batch-processing or with nightly print-runs or whatever. And this is where you would find yourself locked out with .PDF and, glancing at Google, you would guess you’d be out of luck!
Since we had:

  • created a web based solution
  • the need to print directly
  • print in nightly-runs

plus we had:

  • about 400 reports (.rdf files) which we need to reuse (without having the opportunity to rebuild them in something like pl/pdf)
  • combine different output / distribution mechanisms

we needed to tackle this challenge!

So we did !!

It was fixed by using some old and new technology mixed together:

Oracle Reports Builder
and
Oracle Fusion Middleware, more specifically, Oracle Reports Server, aka WLS_Reports

By using this combination of products, you can create a printing solution which is capable of printing directly to your network printer, or of creating HTML or PDF reports.
Schedule them, e-mail them, and all this by URL-control!

http://<your-reports-server-node>:8888/reports/rwservlet?command=argument&command=argument&and-so-on

Use the following (much used, but far from complete) list of control commands:

  • report=<name of your .rdf>
  • userid=<userid/password@database>
  • desformat=HTML/PDF
  • destype=type of output of the report
  • desname=name of your output (device, file, whatever)

More commands can be found in the documentation link at the bottom of this post!!
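For illustration, a direct-to-printer call could then look something like the following, where the report name, credentials and printer share are of course placeholders:

http://<your-reports-server-node>:8888/reports/rwservlet?report=invoices.rdf&userid=scott/tiger@orcl&destype=printer&desname=\\printsrv\warehouse01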

Notes:

  • You can post these parameters to the Reports Server without calling them in the original URL!
  • You can set a “local” on your Reports Server to omit the <@database> part of ‘userid’ for your default database
  • Actually you can set all environment variables, like TNS_ADMIN, NLS_LANG, REPORTS_PATH, etc.

What we found is we needed to run Oracle Reports Server on Windows, just to take advantage of the Windows Printing System which is quite stable and easy to configure. (So, yes, okay, there you have it, a good thing about Windoze!)

Basically you can create a simple solution, but you can easily expand it quite a bit, making a printing and reporting solution worthy of an enterprise environment, with distribution of reports via e-mail, creating reports on file systems, embedding reports in websites, and basically anything you want or would need.

And, you get a nice Management Console for free with this installation!

Oracle Enterprise Manager Console

From this management console you can administer your print jobs and set all kinds of parameters, which is quite neat!!

But, wait… the catch… It’s gonna cost you!

Or, can you keep it under control?

But of course!

Printing is mostly a half-on-line thing, and for a lot of stuff, it’s not extremely performance / time critical… So what can we do?

Oracle Reports Server is licensed as “Oracle Forms & Reports Server” and it will set you back € 370 per Named User or € 18,200 per CPU (being Oracle CPUs according to the Core Factor Table!)
It’s still a whole lot of money, but would you really need more than 2 cores? If you give the machine enough memory and fast disks? Probably not.

Is it worth considering adding another node to your environment? Perhaps. This print solution could be a viable reason to do so. It brings you quite a bit of functionality straight out of the box. But, as always, do your math and make educated choices.

The documentation link promised:
https://docs.oracle.com/cd/E16764_01/bi.1111/b32121/toc.htm

If you would like more info, please just drop me a line!

TCL, Total Cost of Loss, a new business perspective

‘Total cost of Loss’ (TCL) was launched at the World Premiere of the Standard Edition Round Table during the OUGF Harmony 2014 annual user conference.

Doing nothing does not mean it costs nothing

Joel J. Goodman, Finland 2014

“TCL.” Abbreviations.com. STANDS4 LLC, 2014. Web. 15 Jun 2014. <http://www.abbreviations.com/term/1519392>.

Total Cost of Loss is the representation of the cost for an organization when data is lost. Experience teaches that this is the hardest exercise in business continuity to figure out and the most neglected threat to an organization.

Next to the two best known terms RTO & RPO and the less well known term RTDA (‘Recover Time to Data Availability’), TCL is aimed at providing the business with an extra ratio to conduct BCP.

To correctly evaluate the investments that have to be made to create a sufficient RTO time frame or RPO granularity, there has to be an understanding of the magnitude of the (financial) importance of the underlying (data) system. TCL is aimed at calculating this figure, where this figure is valid per specific data system.

The following components have currently been identified as being part of TCL:

  1. Collection price per granule of data*
  2. Present value per granule of data
  3. Business value per granule of data
  4. Added value in a dataset combination

* a granule of data is the smallest possible set of variables comprising a usable piece of information.

1. Collection price per granule of data:
The amount of effort (time, computing power, etc.) which is required to assemble and record the granule of data in the data-structure.

For example: 1) the time it takes to pick up an item, scan its bar-code with a bar-code scanner and put the item back, or 2) the time it takes to enter somebody’s name and address at admittance, inclusive of possible preparation and filing.

2. Present value per granule of data:
The current amount of effort (if possible at all) which is required to reassemble and record the granule of data in the data structure. This component takes into account that historical data could be easy to collect at the historic point in time (#1) but would take a disproportionate effort to collect at present.

For example: 1) establishing whether the item was in stock at the given moment, what its bar-code would have read at that time and possibly who scanned it at what location, or 2) finding out what person came to be admitted at that specific date and retracing what the data would have been that was entered at that specific moment and possibly by whom.

3. Business value per granule of data:
The value of the single entity of data for the operational business after the moment of measurement. During its lifetime, the value of a specific granule of data can change. Most often it will become less valuable, making it possible to archive or even cumulate** the data in multi-tier storage solutions, but, when called upon, this specific granule of data could be of vital importance!

For example: 1) knowing how many of a specific item are in stock, or 2) having identified a specific person within the client group.

4. Added value in a dataset combination:
It can very well be, and most probably is, the case that any granule of data is of key importance to a dataset combination, where several bits of data from different datasets or data systems combine to create information which is vital to a specific action within an organization.

For example: 1) knowing how many of a specific item are in stock to support a JIT-delivery system to keep a production line going uninterruptedly, or 2) delivering the right treatment to any specific person and being able to bill them accordingly.

** Cumulation of data can destroy a recovery path for retrieving any specific granule of data.

Creating a formula to calculate any TCL will be relatively easy.
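As a minimal sketch (purely my own interpretation of the four components above, not an official definition), such a formula could look like:

TCL (per data system) ≈ Σ over all lost granules [ present value + business value ] + Σ over all affected dataset combinations [ added value ]

where the present value (component 2) takes the place of the original collection price (component 1) whenever the data can still be recollected, and the added value (component 4) captures what is lost beyond the individual granules.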

Creating a model to extract or calculate or even guesstimate the values for the different variables of the formula will be the challenge.
A challenge that needs to be met because of the ever increasing volume of data and the ever increasing importance of certain realms, like healthcare, public services, transportation, etc., within this data mass.

Please step on board and help define TCL as it could prove to be a critical factor when push comes to shove!

My introduction to Synology DiskStation DS214

Data growth visualization

In these times no household can be without ‘Home IT’, complete with a ‘home data center’. The figures for business data growth are at least equally applicable to the home situation!

As in probably many situations, we started out with just some loose PCs, which could be hooked up to The Internet, and after some time, also to one another. Gradually a fileserver was introduced, because files were always on some other computer than the one you needed them on. This made for a nice status quo, until the introduction of tablet computers and what not…

Now it was no longer so easy to access files or do whatever you wanted or needed, and space was becoming scarcer by the day. Plus, our trusted old fileserver was starting to decay. I needed to intervene. With a ‘data center’ consisting of just a router, a WiFi access point, a central printer, some old and some newer workstations and tablets and this trusted old fileserver, an exchange wouldn’t be a daunting task…

A few years back I came across a company called Synology. I met one of their product gurus and somehow the product name stuck. Lately some of my colleagues started working with Synology in their home IT, so I decided this would be the way to go!

As I was running a fileserver with just 100 Gigs of storage, it was not hard to find something with a bit more capacity. And the decision finally wasn’t too hard indeed.

A Synology DS (DiskStation) 214play it had to be and I decided upon 2 hard disks of 4 TB (Seagate Barracuda ST4000DM000) in a S.M.A.R.T. mirror setup.

Okay, so decision made, product purchased, product setup (which was just made too easy by all the discovery tooling). Disks setup in RAID 1 and on with the show!

And now the discovery fun begins…

It appears Synology has made 9 apps (for iPad / iPhone) that do all kinds of things!

  • DS Cloud, for your personal cloud solution
  • DS Mobile, monitor and manage download tasks on your DS
  • DS Finder, find Disk Stations and manage them (even remotely)
  • DS Photo, manage and view your photo collection
  • DS Cam, turn your DS into a video surveillance server
  • DS Download, to download anything you want (but mostly movies of course)
  • DS File, find files and anything else
  • DS Video, stream video anywhere
  • DS Audio, stream audio anywhere

So, okay, this is a bonus already! Just place files in the correct folder and you’re in business!

Currently I am migrating data towards the DS having first enabled some of the many apps inside the station, like:

  • Anti Virus
  • Java
  • iTunes Server
  • Mail Server (need to look into this; with many accounts at as many providers, mail is a menace!)

Okay, well, these are just a few, but I fear there are MANY more apps out there!

Actually, from opening the box until being in business it took me 2 hours. Having a first run of customization down took me another 2 hours.

I guess doing the rest of the stuff

  • Hooking onto CrashPlan and possibly MindTime
  • Getting VPN set up
  • Being able to access the DS over The Internet
  • Figuring out automatic movie download
  • Getting iTunes Server hooked up and running

will take me some hours still, but hey, this is more or less classifiable as hobby or at least additional functionality!

Some other things I heard / saw a DS was used for:

  • Content Management server
  • Database server
  • Directory server
  • DNS server
  • ERP system
  • Forum host
  • GIT server
  • Online shop
  • Python
  • Website host
  • Wiki host
  • WordPress host

So, in the end, all is good now, we got plenty of space, a lot of new functionality. I would say, ‘Yay Synology’!

Oracle in perspective

A brief overview of alternatives…

This document focuses on the perception of the Oracle database related to ‘Small and Medium businesses’, European Style.
First we will take a quick look at Enterprise licensing and give a ballpark idea of prices and possibilities. Next I will put this in perspective with more detail and will highlight possibilities to get ‘high end results’ with what is branded as ‘entry level’ investments. Everywhere I say Oracle, I mean the Oracle database.

Oracle is investment intensive
Oracle Enterprise Edition licenses are price-listed at over € 35,000 per processor. These CPUs are actually not ‘real CPUs’ but units which are defined according to Oracle’s Core Factor Table.
An Oracle Enterprise Edition license allows you to a) install and use the Oracle Enterprise Edition software and b) buy additional tooling to complete the Enterprise software stack. In this setting there is Oracle Active Data Guard, Oracle Database Vault, Partitioning, etc. to consider.
With Oracle Enterprise Edition it is possible to create a high performance, highly available and ‘disaster resistant’ environment. It needs to be remarked, though, that this program set comes with a matching price tag.

Oracle Standard Edition environment
A special exception in the Oracle license politics is the Oracle Standard Edition database. This installation uses the exact same database software (binary compatible) as the Enterprise Edition, but comprises a significantly reduced set of features and options, which can be found in this global overview. The most important question is whether these features and options are really needed to realize a high performance, highly available and ‘disaster resistant’ environment.
Let’s first quickly zoom in on a practical example to indicate an investment perspective.
It is based on an HP Proliant DL380 Gen8 E5-2690v2 server with 2 processors of 10 cores each.

— Oracle Enterprise Edition:
2 x 10 cores x 0,5 core factor = 10 licenses x € 37,492 = € 374,920 excluding maintenance.
— Oracle Standard Edition:
2 x 1 processor = 2 licenses x € 13,813 = € 27,626 excluding maintenance.
— Oracle Standard Edition One:
2 x 1 processor = 2 licenses x € 4,578 = € 9,156 excluding maintenance.

In this setting we can save up to € 365,764 by leveraging Standard Edition One. The reason is partly that the Standard Edition software is significantly cheaper, but mainly that the Standard Edition software is licensed per processor socket instead of by the units defined by the ‘Core Factor Table’!
The limitation is that Standard Edition has a limit of 4 sockets per server and Standard Edition One is limited to 2 sockets per server. This is an important fact!

Room for investment
In our example it is possible to decide in favor of Standard Edition One. What we can subsequently deduce is that we have a theoretical budget of about € 350,000 available to make sure we have a sufficiently high performance, highly available and ‘disaster resistant’ installation. Even if we were to consume all of this budget, which is not very likely, the return on this investment remains high, because the year-by-year support cost for this environment is ((10 x € 8,248.19) minus (2 x € 1,007.15)) € 80,467.60 per year cheaper.
In this calculation possible discounts have not been included. Looking at the volume of the investment differences, any discounts will not have a decisive influence. The year-by-year support cost will remain based on the original price of the software.
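For those who like to verify the arithmetic, a quick query using the list prices and support figures quoted above gives the same numbers:

-- Sanity check of the figures above, based on the list prices quoted in this post
SELECT 10 * 37492                 AS ee_license_cost,       -- € 374,920
       2  * 4578                  AS se1_license_cost,      -- € 9,156
       10 * 37492 - 2 * 4578      AS license_saving,        -- € 365,764
       10 * 8248.19 - 2 * 1007.15 AS yearly_support_saving  -- € 80,467.60
FROM   dual;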

Virtualization
One of the most significant hurdles with leveraging the Oracle software is virtualization, where technical considerations are not the toughest to deal with; the license consequences are!
As we concluded, Oracle Standard Edition is applicable to a maximum of 4 processor sockets. In case of virtualization, the rule is that all processors of all hardware to which the Oracle database can migrate, either automatically or with live migration, have to be licensed.
With this rule it is nearly impossible to leverage Standard Edition licenses, and it will be nearly impossible to use virtualization in a ‘small to medium business’ setting… unless a smart alternative is chosen.

Alternatives
1. The abstraction layer
By leveraging virtualization software as an abstraction layer, a server installation can be separated from the physical hardware configuration on which it runs. By using this alternative it is possible to recover from hardware failure more efficiently.
2. 2 x 2 sockets
By using a limited virtualization cluster of 2 nodes with 2 sockets each, with each socket holding the maximum possible number of processor cores, the complete advantage of virtualization can be combined with the maximum advantage of Standard Edition. Please note that in this setup we would need a Standard Edition license. Alternatively you could create a cluster with 2 x 1 socket to facilitate the usage of a Standard Edition One license.
3. ESL
In case software from a third party is used, this software development party can agree on using an Embedded Software License (ESL) from Oracle. This form of licensing is quite specific and is therefore not further discussed here.
4. What virtualization will not solve
Virtualization is no replacement for backup and it is no alternative for disaster-proofing an Oracle database. These specific tasks are resolved by using backup or standby database tooling.

Tooling
In the beginning of this article it was indicated that the Oracle Enterprise Edition license gives you the right to buy additional tooling to complete the Enterprise software installation.
Alternatives for this tooling are also available for Standard Edition installations. Please consider:

  • Dbvisit as an alternative for Oracle Data Guard or Oracle Golden Gate
  • OraSash as an alternative for Oracle Active Session History
  • Nagios or SPS GenSys as alternatives to Oracle Enterprise Manager

Conclusion
Based on the information above we can conclude there are good possibilities to leverage the Oracle database in a ‘Small and Medium Business’ environment. The information above is not a complete and ultimate description of all possibilities, but this quick overview gives enough to work with to zoom in on any specific challenge.

Oracle XML DB content easily moved

We have this application where we just store some specific content in an XDB-schema.

After a quick move of the Oracle database from a legacy system to a new environment, we found that the XDB-schema and its contents were not moved. Okay, this is what happens when you use “good-ole” imp/exp instead of some “newer” technologies like RMAN or expdp.

What now? We could start the entire move again (but that would mean downtime for recreating the database, amongst other things) or we could do a specific move for the XDB-schema (but meanwhile new content was already being added to the system). Actually none of these are the nicest scenarios, and they seem to add too much complexity. Not what we want…

What about a smart alternative here too! We could simply use ftp after all.

Using the EPG (Embedded PL/SQL Gateway) functionality of the database, we can just enable ftp through the Oracle database listener. With this functionality we can access the application database on the legacy system through ftp and easily copy the content to a local directory, especially since there are just a few hundred Megabytes of data.

Enable this access by:
execute dbms_xdb.setFtpPort(2100);
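To verify the port is actually set (assuming you have the required XDB privileges), you can simply query it back:

select dbms_xdb.getftpport() from dual;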

With a tool like Filezilla, the contents were copied to a local directory.

After the action is complete, ftp access to the database is closed again; you can never be too careful!
execute dbms_xdb.setFtpPort(0);

Loading the contents into the new location is a repetition of these actions. Enable the EPG-ftp-port on the new database, use an ftp-tool to upload the data and don’t forget to disable the EPG-ftp-port afterwards.

One tricky thing is that you should mind the data ownership. This is easiest done by connecting to the ftp account with the same user that owned the data in the source database!

When you run into errors, there could be something wrong with your XDB installation. Please look at this post for some more on that!!

Okay, and now my database server crashed…

RTO/RPO, who has ever heard of that! That was Star Wars, right?
Storing data and never having to go without or losing any… Yes, that’s more like it.

Server Crashed

Okay, and these two have everything to do with each other!

Talking about these two fancy IT abbreviations, I have raised many eyebrows and helped secure businesses!

What are they?
RTO: Recovery Time Objective, or rather, how long should it take before your database is up and running again?
RPO: Recovery Point Objective. How much data can you stand to lose?

It is customary to put real amounts of time on both of these parameters. This is one of those true points where IT ‘meets’ business, one of those do or die SLA parameters.
How long before you can start working again after something has gone somewhat horribly wrong? Dependent on the business (and for sake of argument), you will get something like; “Oh well, if we are back in business in say an hour, I guess we’ll be fine.” Okay, so we have RTO = 1hr.
And, how much data can you afford to lose? “Losing data, what do you mean?” Well, let’s say you have been on the phone and in the field harvesting order data and putting this in the database… how much of this information can be reproduced when your environment fails? We’ll go with two scenarios. We will presume “Oh no, NOTHING!” and “Hmmm, well, 10 minutes, if needs be!”, making respectively RPO = 0 min. and RPO = 10 min.

  • RTO = 1 hour
  • RPO = 0 minutes or 10 minutes.

Let us investigate what this means, assuming we have a functional backup running every night and that our drama happens at 15:45 on a working day.

What do we have when we do nothing?
After establishing that we have a system crash at hand, we need to start working immediately to rebuild something, but do we have something to build upon?
Do we have hardware? And does it somewhat meet specs? Can we run our OS (version) on it? Do we have OS media to install with? Do we have Oracle media to install with? Can we get network, and so on…
And if we have this do we have enough expertise to get it installed?
Well, I guess it’s clear… We need to invest big-time! A few hours getting all the facts straight and getting hardware, a few hours to install and configure the OS, a few more for Oracle, getting it to resemble the former production environment, and then restoring the backup!
RTO = starting at 8 hours.
Looking at our RPO? Well, okay, that’s easy! We back up at midnight (0:00) and we crash at 15:45. So we will have lost 15 hours and three quarters.
RPO = 15:45 hours.
Acceptable? No, not really!

It’s clear we have to do something.
The first step is to reduce RTO, we need to be able to continue work faster.
We can do this by making sure we have a second server standing by in a different location. Have it installed, have it configured and ready to jump into action. You could call this a Standby Server.
But even now there is no guarantee we make our target, since restoring a backup and getting the database up and running could still easily take over 1 hour when dealing with red tape and decision levels. To hit a home run we need to add one more feature: we need to have not only a Standby Server, we also need to have a Standby Database. A database that can be “opened” or “activated” in mere minutes.

  • Are you running Enterprise Edition Database then you can use Oracle Data Guard, included in your database license.
  • Are you running Standard Edition Database then you can get the Smart Alternative from Dbvisit.

With Standby Database in place:
RTO = 5 minutes!!

Now we need to tackle RPO!
Or… do we still?
RPO = 10 minutes is actually already tackled by the Standby Database implementation.
Because of the characteristics of Standby Database, we do not only have an RTO of mere minutes, we also have an RPO of a configurable duration.
Data is transferred to the Standby Database environment by means of archived redo log files. This mechanism is influenced by the switching of log files; if you force log switches at small enough intervals (less than our target of 10 minutes), you make sure that the age of the data in the Standby Database meets the target “Recovery Point Objective”!
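A minimal sketch of scheduling such a forced log switch (my own illustration, assuming sufficient privileges; tune the interval to your own RPO target):

-- Force the current redo log to be archived every 10 minutes, so the standby
-- never lags behind by more than the RPO target.
BEGIN
  DBMS_SCHEDULER.CREATE_JOB(
    job_name        => 'FORCE_LOG_SWITCH',
    job_type        => 'PLSQL_BLOCK',
    job_action      => 'BEGIN EXECUTE IMMEDIATE ''ALTER SYSTEM ARCHIVE LOG CURRENT''; END;',
    repeat_interval => 'FREQ=MINUTELY; INTERVAL=10',
    enabled         => TRUE);
END;
/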
RPO = 0 minutes
Well, okay, this is something else. And if we think about this a little, it’s something completely different!
Recovery Point Objective, the amount of data we can stand to lose, is 0 (nothing!). This actually means we have to create a Standby Database setup which is kept continuously up to date with the primary environment. This kind of Standby Database environment allows you to switch to this second environment within seconds and continue your business operation without delay!

And, with your Active-Active Standby Database solutions in place:
RPO = 0 minutes!

So, now you know about RTO/RPO to secure your data, and you know this guy is something else.

r2-d2

Increasing the reach of your SE database license

Imagine the following situation…

For a few years now, your business has been investing in centralizing valuable business information. After some research in the market you have found the Oracle database to be the best fit for your requirements.
Using the free Oracle Application Express (APEX) framework, helping you to rapidly develop the web applications needed to support both internal and external users, was a premium. By basing this installation on the Oracle Standard Edition One database, you have created this solution with the lowest possible investment!

As many great projects go, the use and the number of APEX applications is growing. With the addition of ready-to-use applications to inspire you and many cool plug-ins to ever increase the usability and integration possibilities, you get caught up in the data growth dogma!
With an ever increasing user population and an expansion of data-reporting for ever faster business reporting, your initial system is starting to fail, showing ever more frequent performance lags or system unavailability. These problems form a risk for your business, a risk you need to eliminate as soon as possible!
The standard advice here would be to upgrade your environment: upgrade to a bigger machine and to an Enterprise Edition database. This is what your investment would then be…

  • Medium Oracle Sun Server X2-4 with 4 x 10 core CPUs at € 42,500
  • (40 cores x 0,5 core-factor **) 20 Oracle Database Enterprise Edition licenses at € 914,800

Without rendering your application infrastructure worthless by the required investment, a more reasonable step would be to migrate to Oracle Database Standard Edition.

  • Medium Oracle Sun Server X2-4 with 4 x 10 core CPUs at € 42,500
  • 4 Oracle Database Standard Edition licenses at € 67,400

Still requiring a total investment of more than a hundred thousand Euro and leaving you with the old server and licenses to be decommissioned.

In many implementations, not data entry but data-mining or information aggregation are the costly processes. So probably this will be true in this situation too. With a little investigation it is possible to separate a number of functions that will only query data and not necessarily modify data. Especially in this situation you can also increase your application performance by moving these specific processes to a new environment.

But… how…

The information in the new environment needs to be real-time consistent with the “production” or primary environment. Here we introduce a real-time data replication solution like Dbvisit Replicate which will create just this real-time consistent query environment for you! This makes for the following investment:

  • Medium Oracle Sun Server X4-2 with 2 x 8 core CPUs at € 19,500
  • 2 Oracle Database Standard Edition One licenses at € 11,200
  • 4 Dbvisit Replicate XTD at € 16,180

With this installation you add another € 50k of licensing instead of € 100k with the Standard Edition migration. With this choice, you separate your time-critical data-entry process from the query environment, making sure a misfired query will not influence the availability of your data-entry environment, which is a cool extra advantage!

* All prices are based on list-prices, excluding VAT and including 1 year of support.
** Based on the Oracle Processor Core Factor Table.