
Why I picked Postgres over Oracle, part III


This is the final episode of this short series of blog posts on some of my drivers for moving to Postgres from Oracle.
Please do read Part I and Part II of the series if you have not done so. Together they discussed the topics “History”, “More recently”, “The switch to Postgres”, “Realization”, “Pricing”, “Support” and “Extensibility”.

In summary:

  • Part one focused on “why not Oracle anymore, so much”
  • Part two discussed the comparison between PostgreSQL and Oracle
  • Part three talks some more about what Postgres actually is

Community

One of the more important things to be really, really aware of is that Postgres is not “just open source”. Postgres is “community open source”.

Now, why would that be important, you might wonder.

We all know what open source stands for. There are many advantages to an open source system, and in our case, an open source database.
A number of arguments are in this blog post series. If you take this one step further, though, and realize that Postgres is a community open source project, what are the extra advantages?

A community open source project is not limited, in any way, to any one specific group of developers (let’s call them a company). For example, let’s look at MongoDB. This is an open source database, but it is developed by MongoDB Inc.
It is, in essence, controlled by MongoDB.

Postgres is developed by the Postgres Developer Community coached by the Postgres Core Team.
This makes Postgres incredibly open, independent and it enables its developers to truly focus on actual business problems that need to be solved. There is no ulterior drive to satisfy commercial goals or meet any non-essential requirements.

Development

A very important differentiator, one that only became clear and apparent to me after I dove into Postgres some more, is the development…

The actual development of the database core software is done by this community we’ve just identified.

“Well, yes…” you might say, this is what open source stands for. But the impact of this extends well beyond support, as I mentioned in Part II of this series. The ability to be part of where Postgres goes, to have actual influence on its development, is awesome, especially for a database platform in the current “world in flux”.
Postgres users don’t necessarily have to wait until “some company” decides to put something on the roadmap or develop it at their discretion. Such company decisions will mostly be driven by the most viable commercial opportunity, not necessarily the most urgent technical need.

The development of Postgres is more focused on “getting it right”.
One nice example is the Postgres query optimizer. The Postgres community hates bugs. When bugs start to get discussed, it results in many emails within the community, which means a lot of reading!
Many bugs are therefore fixed very quickly, so that this email storm stops!
Turn-around times for optimizer bugs (from report to production fix) can be as low as 72 hours, even for a mechanism as complex as the query optimizer.

Invitation

I would like to invite all of you in the Oracle community to take a look at the Postgres query optimizer and share your concerns, worries, bugs or praise with the Postgres community.
If you want to, you can share this with us on the https://www.postgresql.org/list/pgsql-hackers/ mailing list. We are looking forward to your contributions!
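
If you want a concrete starting point: the planner’s decisions are easy to inspect with EXPLAIN. Here is a minimal sketch (the table and data are made up purely for illustration) of the kind of plan output that optimizer discussions on the list tend to revolve around:

    -- A hypothetical table, just to have something to plan against
    CREATE TABLE orders (id serial PRIMARY KEY, customer_id int, total numeric);
    INSERT INTO orders (customer_id, total)
        SELECT i % 1000, random() * 100 FROM generate_series(1, 100000) AS i;
    CREATE INDEX ON orders (customer_id);
    ANALYZE orders;

    -- Shows the chosen plan, the cost estimates and the actual timings
    EXPLAIN ANALYZE SELECT * FROM orders WHERE customer_id = 42;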

Future

Oracle

I can only speak from what I see. What I see is that Oracle is becoming an on-line services company. I see them moving away from core technology like the database and accompanying functions. Oracle is more and more moving to highly specialized applications aimed at very big companies.
Chat-bots, social media interactions, integrated services and more, delivered from a tightly integrated but also tightly locked set of Oracle owned and operated data-centers, or rather, the Oracle cloud.

Is this useful? Of course; there will be targeted customers of Oracle who will continue to find all of this extremely useful, and it will be, to them.
Is this for me? No, not really.

PostgreSQL

In the beginning, Linux was not something anyone wanted for anything serious. I mean, who wanted to run anything mission critical on anything other than Solaris, HP-UX, VMS or IBM? No-one…
And that was just a few years ago. Imagine!
Today, in any given data-center, if you eliminated the Linux-based servers, you would not have much left.
This same thing is now happening in what I guess is the second wave of open source. More complex engines are being replaced by open source, and the ever-present relational database engine is one of those.

Why? Price, extensibility, flexibility, focus, you name it. We have seen it before and we will see it again.

EnterpriseDB

If you permit me just these few words.

I think EnterpriseDB is extremely important for PostgreSQL. We have been fighting at the forefront since the beginning, supporting PostgreSQL’s move to the enterprise. EnterpriseDB has devoted, and will continue to devote, a large amount of its resources to PostgreSQL. We are a PostgreSQL support company. We just have not been very good at patting ourselves on the back…
As a company we are doing extremely well, simply because Postgres is rock solid in all facets and ready to take on the world, even the most daunting tasks – and beyond.
This will continue as Postgres will continue in this second wave of Open Source.

I thank you for your attention.
If you have additional questions or comments, please do not hesitate to contact me.


Why I picked Postgres over Oracle, part II

Continuing this short series of blog posts on some of my drivers for moving to Postgres from Oracle.
Please do read Part I of the series if you have not done so. It discussed the topics “History”, “More recently” and “The switch to Postgres”.

Realization

In the last months, discussing Postgres with my Oracle peers and playing with the software and the tooling, I actually quite quickly realized Postgres is a lot cooler, at least to me. Not so much overly complicated technology, but rather built to be super KISS. The elegance of simplicity, and it still gets the job done.
Postgres handles a lot more complex workloads than many (outsiders) might think. Some pretty serious mission-critical workloads are handled by Postgres today. Well, basically, it has been doing this for many, many years. This obviously is very little known, because who would want to spend good money on marketing for open source software, right? You just spend your time building the stuff and let somebody else take care of that.
Well… we at EnterpriseDB do just those things, …too!

And, please, make no mistake, Postgres is everywhere: from your fridge and video camera, through TV set-top boxes, up to major on-line banking software. And many other places where you would not expect a database to (be able to) run. Postgres is installed in places that never get touched again. Because of its stability and its low- to no-touch administrative character, Postgres is ideally suited for these specific implementations. It follows one of the oldest design principles behind Postgres: creating the database engine doesn’t have to be easy, as long as it “just works” in the end.
Many years ago, an Oracle sales director included a similar overview in his pitch: all the places where Oracle touches everybody’s lives, every day. This is no different for Postgres, it is just not pitched anywhere, by anyone, as much.

I have the fortunate opportunity to work closely with (for instance) Bruce Momjian (PostgreSQL core team founding member and EnterpriseDB colleague). I have also had the opportunity to learn from him some of the core principles on which Postgres was designed and built. These are fundamentally different from many other software projects I know, and I feel they truly answer some of the core requirements of database projects out there today! There is no real overview of these principles yet, so that’s on my to-do list.

Working with PostgreSQL

Pricing

Postgres is open source… it is true open source. It is even a true community open source project, but more about that in the next installment.

Open source software is free to use… so it costs nothing, right?

But, wait! Open source does not mean free of costs! How…, why…, what do you mean??

Well… you need support, right!?
The community can and will help you, answer questions and solve some of your problems. But they will not come in to install, configure and run Postgres for you. You will need to select and integrate your specific selection out of the wealth of tools. You basically have a whole bunch of additional tasks to complete to get your Postgres platform sorted out.
Companies like EnterpriseDB can help you mitigate these tasks. This allows you to focus on the things you actually want to achieve, using Postgres.

In comparison to traditional database vendors, the overall price of your solution will be reduced significantly when using Postgres as your open source database engine.

Support

A significant difference between Oracle support (for instance) and open source support services is interchangeability.
In the end, Oracle support can only be given by Oracle. They are the only ones that have access to the software sources and can look up (and hopefully fix) issues. For Postgres, or any true community open source product, different companies can provide support. If you don’t like the company you work with… you switch. This drives these companies to be really good at delivering support! How is that for an eye-opener?

Extensibility

One of the superb advantages of Postgres is its native extensibility. I mean, think about it for a moment… having a relational database platform with the strength of Postgres – the strength of an Oracle or a Microsoft SQL Server, for that matter. Postgres gives you more options to integrate a wealth of data sources, data types, custom operators and many other extensions than you will ever need! The integration into Postgres is so solid that these extensions function like any other function in the core of Postgres.
And, rest assured, the chances that you will ever be faced with having to build this yourself are extremely slim. There are 30 to 40 thousand developers working on larger and smaller pieces of Postgres code. Chances are that if you find yourself challenged, somebody else faced and solved that challenge before you. That solution will then be available for you to take and use, solve your challenge and move on. That is also open source for you.
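
As a small, concrete taste of that extensibility, here is a minimal sketch of a custom operator (the operator name and function are mine, purely for illustration); this is plain Postgres SQL, no core changes required:

    -- A tiny custom operator: case-insensitive text equality
    CREATE FUNCTION ci_eq(text, text) RETURNS boolean
        AS $$ SELECT lower($1) = lower($2) $$
        LANGUAGE SQL IMMUTABLE;

    CREATE OPERATOR ~== (
        LEFTARG   = text,
        RIGHTARG  = text,
        PROCEDURE = ci_eq
    );

    -- Use it like any built-in operator
    SELECT 'Postgres' ~== 'POSTGRES';   -- returns true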

This capability is what makes Postgres ideally suited to fill the central role in the polyglot environments we see being built today.
Maximizing the amount of information from data spread across multiple data silos in an organization is a challenge we see more and more often. Integrating traditional applications such as ERP and CRM with data-warehousing results, combined again with big-data analysis and event-data-capture aggregates, generates additional decision-driving information out of the combination of these silos. Postgres, by design, is ideally suited for this. It saves you from migrating YUGE amounts of data from one store to another, just to make good use of it.
The open source dogma “horses for courses” eliminates double investments and large data migrations or transformations; it just enables you to combine and learn from what you already have.
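
Foreign data wrappers are the Postgres mechanism behind this. As a minimal sketch (the server name, host, credentials and remote table are made up for illustration), this is roughly what it takes to query a remote Postgres silo as if it were a set of local tables:

    -- Attach a remote Postgres "silo" through a foreign data wrapper
    CREATE EXTENSION postgres_fdw;

    CREATE SERVER warehouse
        FOREIGN DATA WRAPPER postgres_fdw
        OPTIONS (host 'dw.example.com', dbname 'sales');

    CREATE USER MAPPING FOR CURRENT_USER
        SERVER warehouse
        OPTIONS (user 'reporting', password 'secret');

    -- Expose the remote tables locally and query them like any other table
    CREATE SCHEMA remote;
    IMPORT FOREIGN SCHEMA public FROM SERVER warehouse INTO remote;
    SELECT count(*) FROM remote.orders;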

End of part II

A link to part three of this blog post will be placed here shortly.

Why I picked Postgres over Oracle, part I

As with many stories, if you have something to tell, it quickly takes up a lot of space. Therefore this will be a series of blog posts on Postgres and a bit of Oracle. It will be a short series, though…

Let’s begin

History

I started with databases quite early on in my career. RMS by Datapoint… was it really a database? Well, at least sort of. It held data in central storage, but it was a typical serial “database”. Interestingly enough, some of this stuff is still maintained today (talk about longevity!).
After switching to a more novel system, we adopted DEC (Digital Equipment Corporation) VAX, VMS and MicroVAX systems! Arguably still the best operating system around… In any case, it brought us the ability to run the only valid database alternative around: Oracle. With a shining Oracle version 6.2, soon to be replaced by version 7.3.4. Okay, truth be told, at that time I wasn’t really that deep into databases, so much of the significance was added later. My primary focus was on getting the job done, serving the business in making people better. Still, working with SQL and analyzing data soon became one of my hobbies.
From administering databases, I moved on to a broad range of things, but always looping back to, or staying connected to, software and software development using databases.
Really, is there any other way? I mean, building software without using some kind of database?
At a good point in time, we were developing software using the super-trendy client-server concept. It served us well at the time and fitted the dogma of those days. No problems whatsoever. We were running our application on “fairly big boxes” for our customers (e.g. single or double core HP D3000 servers), licensed through 1 or 2 Oracle Database Standard Edition One licenses, and the client software was free anyway…
Some rain must fall
The first disconnect I experienced with licensed software was the time we needed to deploy Oracle Reports Server.
After porting our application successfully to some kind of pre-APEX framework, we needed to continue our printing facilities as before. The conclusion was to use Oracle Reports Server, which we could call to fulfill the exact same functionality as the original client-server printing agent (rwrbe60.exe, I’ll never forget) did. There was just no way we could do this other than buying licenses for (I thought it was) Oracle BI Publisher, something each of our clients had to do. This made printing more expensive than the entire database setup, almost the biggest part of the entire TCO of our product, which made no sense at all.

More recently

This disconnect was the first one. Moving forward I noticed and felt more and more of a disconnect between Oracle and, what I like to call, core technology. Call me what you will, I feel that if you want to bring a database to the market and want to stay on top of your game, your focus needs to be at least seriously fixed on that database.

Instead we saw ever more focus on “non-core” technology: Oracle Fusion, Oracle Applications (okay, Oracle Apps had always been there), and as time progressed, the dilution became ever greater. I grew more and more convinced that Oracle didn’t want to be that Database Company anymore (which proved to be true in the end), but it was tough for me to believe. Here I was, having spent most of my active career focused on this technology, and now it was derailing (or so it felt to me).

Then we saw those final things, like the elimination of Oracle Standard Edition One, basically forcing an entire contingent of their customers either out (too expensive) or up (invest in Oracle Standard Edition Two, and deal with more cost for less functionality). What appeared to be a good thing ended up leaving a bad taste in the mouth.
And, of course… the Oracle Cloud, I am not even going to discuss that in this blog post, sorry.

The switch to Postgres

For me the switch came in two stages. First, there was this situation where I was looking for something to do… I had completed my challenge and, through a good friend, ran into the kind people of EnterpriseDB: a company I had only little knowledge of, doing stuff for PostgreSQL (or Postgres if you like; please, no “Postgré” or things alike – find out more about the project here), a database I had not much more knowledge of either. But their challenge was very interesting: grow and show Postgres and the good things it brings to the market.

Before I knew it, I was studying Postgres and all the things that Postgres brings. Which was easy enough in the end, as the internal workings and structures of Postgres and Oracle do not differ that much. I decided to do a presentation on the differences between Postgres and Oracle in Riga. I was kindly accepted by the committee, even when I told them my original submission had changed!
A very good experience, even today, but with an unexpected consequence: the second part of the switch was Oracle’s decision to cut me out of the Oracle ACE program.

It does free me up, somehow, to help database users across Europe re-evaluate their Oracle buy-in and lock-in, and to look at smarter and (much) more cost-effective ways to handle their database workloads. This finalized “the switch”, so to speak.
Meanwhile, more and more people are realizing that there actually are valid alternatives to the Oracle database. Since the adoption of the Oracle database as the only serious solution back in the early 1990s, the world has changed, also for serious database applications!

End of Part I

Please follow this link to the second part of this blog post.

Riga Dev Days 2017, new experiences in many ways.


General

It has been a while since my last blog-post.
One of the reasons is my shift from closed to open source software, databases more specifically. More on that in a later blog-post.

The reason for already mentioning this is this strange hybrid (what a popular word, these days) situation that I am in at the moment.
Thanks to the super enthusiastic, flexible and tenacious organization-team of the Riga Dev Days, I was able to participate.
Having happily boarded the Air Baltic flight, I was on my way to Riga!!

Being new to the broader conference scene, I enjoyed being at a mixed-source developer conference. Besides the usual suspects – some of which are my best friends – I got to meet many interesting new people.
One of the key phrases of the day was: “the more you learn, the more you realize you know nothing – Jon Snow…” and it’s true! You never stop to think about it, but the wealth of subjects is just tremendous, and the combined knowledge at events like these is downright “Yuge, it’s awesome, tremendous!”

Day one

On a day like this, time flies. Between sessions (and during sessions) there are discussions, a bit of work and catching up to do.
Still, I managed to catch a few sessions, like the one by Michael Hüttermann, who made a clear and well-rounded case regarding CI/CD in a DevOps world. A nice insight into the effort that goes into what’s behind the proverbial “push of a button”.
Another example was the talk by Marcos Placona about the many (and very basic) things that you have to keep in mind when building apps. There is no silver bullet, and the best you can achieve is to discourage the hacker so much that they move on. Much like securing your house, so to speak.

The day ended in the medieval basements of Riga, where we had some really good medieval food. Life is good…, well…, it has its moments!

Day two

The keynote address by Edson Yanaga, which kicked off day two of the Riga Dev Days, was quite interesting.
Shortening development and deployment cycles and shrinking feature release sets actually helps improve software and deployment quality by creating faster and more accurate feedback loops. By looking at these concepts in this way, buzzwords like DevOps and Agile actually get some hands and feet. One of the lessons, though, is that doing things this way does not eliminate work or automagically solve various issues for you! It will help in getting predictability and continuity into your software development processes.
A nice eye-opening remark, finally, was: “no, I don’t pay you to make something work on your computer, I pay you to make something work on my computer(s)!!”

Another talk I was able to attend was about blockchains, something I knew nothing of and was actually quite interested in. Nick Zeeb took us through a very lively and very animated tour of what a blockchain actually is and what the awesome potential of this technology can be. I was impressed.

With this, the second day drew to an end, and therewith also came my turn “in the pit”. As this event is held in a movie theater, every room had sloped seating, often packed with enthusiastic participants. I had the opportunity to share my thoughts on the comparison between PostgreSQL and Oracle.
The session was very well attended, with a lot of questions regarding the possibilities of using these other technologies at scales that were not really considered before. You can find a recording of the actual presentation here as soon as it becomes available.

Riga Dev Days was a good conference. I would recommend everyone to either attend or submit an abstract for their event in 2018!!

Why GUI sucks…

Of course we all know GUI stands for Graphical User Interface, just as CLI stands for Command Line Interface, right?
Or, rather, a GUI is this nice, flashy screen where you can easily roam with your mouse, comparable to a multiple-choice quiz, where the right answer is there for the picking.
A CLI, on the other hand, is this dark, mysterious blinking cursor… Nothing happens unless you know more or less what you are doing. Comparable to an open-questions quiz.

Sparked by a recent Twitter discussion, I decided I should probably write the umpteenth blog post about this to make my contribution to this lasting dispute.

Disclaimer:
This post discusses GUI in relation to system administration, not necessarily in relation to data-entry or data manipulation applications that are used in front offices all over the world. I guess CLI has no place in a world like that…


Why GUI sucks?
I have done my fair share of installing, scripting, ad-hoc fiddling, testing and trying. And, I have found myself in the situation where I worked with younger computer geeks or even in situations where nobody had the time to figure anything out – stuff just had to be made to work.

Probably in the few lines above, we could already have the basics for this discussion!

But, why then does GUI suck?

GUI’s suck because they are limiting, labor (or rather RSI) intensive and require you, the operator, to be there, physically clicking away on your computer.

Limiting
They are limiting, or at least most of the time they are, because it is often quite hard to get a visual representation of each and every function of a device / program / system etc. Consider, for instance, a networking device, and try to imagine having to create a GUI that lets the operator configure and define each and every parameter of a specific VLAN or VPN. And then also bear in mind that the GUI has to stay crisp, clean and intuitive.
For this reason, I have seen many vendors create a GUI for basic setup only, relying on the professionals to find their way in the CLI. The GUI can then stay intuitive enough to at least get the basics done.

GUI’s that aim not to be limiting, of which there are a few out there too, need to sacrifice a lot of the things that a good GUI should stand for:

  • Short click paths (3 clicks from anywhere to get where you want to be)
  • Intuitive (don’t have to guess or read a manual to use a GUI)

So, what you end up with, then, is a maze of riddles, where you can easily spend a good day setting up some new functionality. Somehow I believe this is not what the designers set out to do, nor is it a valid solution for most tasks at hand.

Labor intense
I personally find GUI’s often quite labor-intensive. Not just for the absence of the ability to automate tasks, though. Especially if there is a lot of specific configuration that needs to be done, you often end up left- and right-clicking until your hands start hurting.
And, in the end, you always end up with the eerie feeling that you missed out on that one specific setting that would really put the icing on your configuration.

Operator presence
Last, but not least… For a GUI to work, you need to be at your workstation. Period.
Anybody who has ever worked on automated testing of applications that rely on a GUI knows about the hideous crime of having to script test cases, either working with hidden button labels, screen coordinates, etc. These scripts fail every other day because a developer moved a window to a better spot or used a new button label. You end up coding your application just to make it testable.
No, a GUI requires operator presence, making it useless for automation or scaling.

The bliss of CLI
Okay, the Middle Ages… or the Stone Age…
Nothing really fancy, just a black (or, if you are feeling frivolous, you may choose some nice color) square on your screen with a blinking underscore – most often. And then you say GUI sucks?

One of the challenges in this hyper-fast-moving world full of smartphones, tablet PC’s and what have you, loaded with intuitive and fast apps, is to realize that “hard core IT” actually is hard core.
You need to learn your stuff first, know what you do and know about the consequences of the choices you make. You will have to learn to walk the walk and talk the talk. Once you have mastered that, this blinking underscore is no longer a roadblock but an invitation! Just like after mastering a foreign language, you will know what to say and do to open up the potential at your fingertips.

And now, reality
Of course, the above ranting is just one side of the story.
It is even just one side of the story in hard core IT!

As already stated above, sometimes there is no time to really dive into stuff and get to know the tools you need to get to work for you. I am pretty sure we all have been in a place where we needed to get a project done or some functionality realized, where we just did not have the right devices.

What are your options at such a moment?
Get a hardcore IT specialist who does “talk the talk”?
It will probably not be cheap, and it will probably be a very thorough configuration, but just not exactly as you need it to be… Though still a valid option, in a number of cases it’s a no-go.
This… is where a good GUI comes in handy.
It will allow you, yourself, to organize that which needs organizing in an orderly fashion. Okay, the GUI will have to be accurate and well thought through, but that goes for all interfacing; it is also true for the CLI.

Seeing this story unfold… I guess I still think GUI sucks. (sorry!)
But GUI has a place, a very well-earned place, in a super-fast and highly demanding world. Still, I am convinced that if you are working in a highly professional environment, having to do intricate stuff on live environments, a good script for a CLI is the only way you can create some assurance that whatever change you need to execute will actually have a predictable result.
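
To illustrate that last point: a scripted change can be reviewed, tested on a staging system and then replayed verbatim on production. A minimal sketch for psql (the file name, table and column are made up purely for illustration):

    -- change_0042.sql: add a flag to a (hypothetical) orders table
    -- Abort on the first error instead of clicking past it
    \set ON_ERROR_STOP on

    BEGIN;
    ALTER TABLE orders ADD COLUMN processed boolean NOT NULL DEFAULT false;
    COMMIT;

Run it with “psql -f change_0042.sql” and every environment gets exactly the same change; try guaranteeing that with a mouse.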

And putting in the effort of learning how to use any CLI? Well, I guess that’s why it is called “professional IT”.

Complex UI’s versus Simple UI’s

A few days ago I attended the AMIS UX & UI event.

During this interesting event, Niels Mansveld from AMIS presented on UX frameworks. He started off his presentation with an illustration of how user interfaces can create an “experience”, so to say. This illustration was a movie clip by Pixar, taken from the movie “Lifted”. It was so funny that, if you watch it, you will immediately know what Niels meant!

The day after, I wanted to show this movie at home and I found the YouTube link (https://www.youtube.com/watch?v=pY1_HrhwaXU). When I watched the clip, I realized that there was a second part to the preview, which Niels had left out, probably because of time concerns.
What struck me is actually the following…

I have seen quite a few views on user interfaces lately… Most of them talk about having clean and intuitive layouts and how important it is to think thoroughly about this. Shakeeb Rahman and Ultan O’broin are names that pop into my mind when thinking about this, and these gentlemen are very clear about it!
Clean, intuitive UI’s make the enterprise thrive!

Okay, but, as said, the clip went on for a little bit!

The second half tells the story of the Toad saving the situation by using this same ultra-intricate interface! By knowing which knob did what, he was super quick in saving the day!
Now, what would that mean?

Having a clean and intuitive layout may not be the ultimate solution in every situation, regardless! Having an application with a learning curve (not immediately judging the steepness of that learning curve) is not always bad. If this interface helps the professional do his or her job in just a fraction of the time, because he or she knows exactly which button to push, I think it’s a good thing.
I have to admit, there were one or two remarks about this in the flashing demo by Paco van der Linden… But I guess it is too little emphasized.

There are several applications where these more complicated interfaces do a superb job in helping with the task at hand. And, as with anything, don’t blindly follow “best practices”, also in designing user interfaces!
Step back and think about what would work best in your situation!

Hope this helps!

Okay, and now my database server crashed…

RTO/RPO, who has ever heard of that! That was Star Wars, right?
Storing data and never having to go without it or lose any… Yes, that’s more like it.


Okay, and these two have everything to do with each other!

Talking about these two fancy IT abbreviations, I have raised many eyebrows and helped secure businesses!

What is it:
RTO: Recovery Time Objective, or rather: how long should it take before your database is up and running again?
RPO: Recovery Point Objective: how much data can you stand to lose?

It is customary to put real amounts of time on both these parameters. This is one of those true points where IT ‘meets’ business, one of those do-or-die SLA parameters.
How long before you can start working again after something has gone somewhat horribly wrong? Depending on the business (and for the sake of argument), you will get something like: “Oh well, if we are back in business in, say, an hour, I guess we’ll be fine.” Okay, so we have RTO = 1 hr.
And how much data can you afford to lose? “Losing data, what do you mean?” Well, let’s say you have been on the phone and in the field harvesting order data and putting it in the database… how much of this information can be reproduced when your environment fails? We’ll go with two scenarios. We will presume “Oh no, NOTHING!” and “Hmmm, well, 10 minutes, if need be!”, making respectively RPO = 0 min. and RPO = 10 min.

  • RTO = 1 hour
  • RPO = 0 minutes or 10 minutes.

Let us investigate what this means, assuming we have a functional backup running every night and that our drama happens at 15:45 on a working day.

What do we have when we do nothing?
After establishing that we have a system crash on our hands, we need to start working immediately to rebuild something, but do we have something to build upon?
Do we have hardware? And does it somewhat meet the specs? Can we run our OS (version) on it? Do we have OS media to install with? Do we have Oracle media to install with? Can we get network, and so on…
And if we have all this, do we have enough expertise to get it installed?
Well, I guess it’s clear… We need to invest big time! A few hours getting all the facts straight and getting hardware, a few hours to install and configure the OS, a few more for Oracle, getting it to resemble the former production environment, and then restoring the backup!
RTO = starting at 8 hours.
Looking at our RPO? Well, okay, that’s easy! We back up at midnight (0:00) and we crash at 15:45. So we will have lost 15 hours and 45 minutes of data.
RPO = 15 hours 45 minutes.
Acceptable? No, not really!

It’s clear we have to do something.
The first step is to reduce the RTO: we need to be able to continue work faster.
We can do this by making sure we have a second server standing by in a different location. Have it installed, have it configured and ready to jump into action. You could call this a Standby Server.
But even now there is no guarantee we make our target, since restoring a backup and getting the database up and running could still easily take over an hour when dealing with red tape and decision levels. To hit the home run, we need to add one more feature: we need not only a Standby Server, we also need a Standby Database. A database that can be “opened” or “activated” in mere minutes.

  • Are you running Enterprise Edition Database? Then you can use Oracle Data Guard, included in your database license.
  • Are you running Standard Edition Database? Then you can get the Smart Alternative from Dbvisit.

With Standby Database in place:
RTO = 5 minutes!!

Now we need to tackle RPO!
Or… do we still?
RPO = 10 minutes is actually already tackled by the Standby Database implementation.
Because of the characteristics of a Standby Database, we not only have an RTO of mere minutes, we also have an RPO of a configurable duration.
Data is transferred to the Standby Database environment by means of archived redo log files. This mechanism can be steered by forcing log switches: switch at small enough intervals (less than our target of 10 minutes) and you make sure the age of the data in the Standby Database meets the target “Recovery Point Objective”!
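
On Oracle, a minimal sketch of the two usual ways to bound that interval (the 600-second value simply matches our 10-minute target):

    -- Force a log switch by hand (e.g. from a scheduled job)
    ALTER SYSTEM SWITCH LOGFILE;

    -- Or let the database enforce it: switch at least every 600 seconds
    ALTER SYSTEM SET ARCHIVE_LAG_TARGET = 600 SCOPE = BOTH;
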
RPO = 0 minutes
Well, okay, this is something else. And if we think about it a little, it’s something completely different!
A Recovery Point Objective, the amount of data we can stand to lose, of 0 (nothing!) actually means we have to create a Standby Database setup which is kept continuously in sync with the primary environment. This kind of Standby Database environment allows you to switch to the second environment within seconds and continue your business operation without delay!

And, with your Active-Active Standby Database solution in place:
RPO = 0 minutes!

So, now you know about RTO/RPO and how they secure your data, and you know that R2-D2 is something else entirely.


Increasing the reach of your SE database license

Imagine the following situation…

For a few years now, your business has been investing in centralizing valuable business information. After some research in the market, you found the Oracle database to be the best fit for your requirements.
Using the free Oracle Application Express (APEX) framework, helping you to rapidly develop the web applications needed to support both internal and external users, was a bonus. By basing this installation on the Oracle Standard Edition One database, you have created this solution at the lowest possible investment!

As many great projects go, the use and the number of APEX applications grow. With the addition of ready-to-use applications to inspire you, and many cool plug-ins to ever increase the usability and integration possibilities, you get caught up in the data-growth dogma!
With an ever-increasing user population and the expansion of data reporting for ever faster business reporting, your initial system is starting to fail, showing ever more frequent performance lags or system unavailability. These problems form a risk for your business, a risk you need to eliminate as soon as possible!
The standard advice here would be to upgrade your environment: a bigger machine and an Enterprise Edition database. This is what your investment would then be…

  • Medium Oracle Sun Server X2-4 with 4 x 10 core CPU’s at € 42,500
  • (40 cores x 0.5 core factor **) 20 Oracle Database Enterprise Edition licenses at € 914,800

Rather than rendering your application infrastructure worthless through the required investment, a more reasonable step would be to migrate to Oracle Database Standard Edition.

  • Medium Oracle Sun Server X2-4 with 4 x 10 core CPU’s at € 42,500
  • 4 Oracle Database Standard Edition licenses at € 67,400

This still requires a total investment of more than a hundred thousand euro, and it leaves you with the old server and licenses to be decommissioned.

In many implementations, it is not data entry but data mining or information aggregation that are the costly processes, and this will probably be true in this situation too. With a little investigation it is possible to separate out a number of functions that only query data and do not necessarily modify it. Especially in this situation, you can also increase your application performance by moving these specific processes to a new environment.

But… how…

The information in the new environment needs to be real-time consistent with the “production” or primary environment. Here we introduce a real-time data replication solution like Dbvisit Replicate, which will create just this real-time consistent query environment for you! This makes for the following investment:

  • Medium Oracle Sun Server X4-2 with 2 x 8 core CPU’s at € 19,500
  • 2 Oracle Database Standard Edition One licenses at € 11,200
  • 4 Dbvisit Replicate XTD at € 16,180

With this installation you add another € 50 k. of investment instead of € 100 k. for the Standard Edition migration. With this choice, you separate your time-critical data-entry process from the query environment, making sure a misfired query will not influence the availability of your data-entry environment, which is a cool extra advantage!

* All prices are based on list-prices, excluding VAT and including 1 year of support.
** Based on the Oracle Processor Core Factor Table.