Over the last few days, this has been a hot topic in the Oracle DBA communities: "what does a DBA have to do post-18c?" Many people have many opinions, but I think this is the most relevant place to find an answer.
For the next five years down the line, what are the essential technical skills an Oracle DBA needs to learn to stay relevant, and how is this new release going to impact the current DBA job?
The sky is FALLING!
I would start by reading this... https://oracle-base.com/blog/2017/10/02/oracle-autonomous-database-and-the-death-of-the-dba/
The DBA is not going away and the recent announcement at OOW17 has provided the same level of panic a similar paper started in 2003. http://www.oracle.com/technetwork/articles/sql/twp-manage-self-managing-database-128245.pdf
Back in 10g, the DBA was dead because the database was supposed to be self managing. Here we are again 14 years later and the sky is falling again.
The DBA will still be around. Like any job in the IT industry, change occurs and it happens at a rapid pace. The IT professional needs to stay on top of the latest trends and figure out how those trends can help their business meet its goals. The DBA needs to change with the times or they will be replaced by a DBA that does.
Right now, "cloud" is hot, as are big data/analytics, IoT, NoSQL, and many other things. Who can really say what you'll need in five years? That's a long time in the IT industry. Keep current with today's technologies. Learn the new features of the next Oracle version and how you can leverage them to improve your administration and meet your business needs. Keep doing that every day of your DBA career and you'll be just fine.
Thanks for the reply; it sounds relevant. Honestly, I am confused about how big data, cloud, or IoT are relevant for an Oracle DBA, i.e. what is the next best thing to go with? Cloud technologies are the most confusing in terms of what the DBA role will be after migration to the cloud. I know this does not sound very wise, but this is what is on my mind.
In the 1950's, IBM attempted to corner the computer market. They wouldn't sell computers to businesses, they would lease them and provide all the software services. The US government called this a trust, and IBM had to sign an anti-trust agreement to allow competition to exist.
Now, Oracle is trying to lease computer services and programming, providing all the services to businesses. How is this different than the 1950's? There are already other companies doing this too. Cloud is the marketing term for this, and more specifically, the top executives at Oracle have their compensation largely based on meeting growth targets for this section of the business. There's nothing wrong with that, but it does lead to some Chicken Little marketing as the others have noted.
As computer technology advanced into the 1970s, minicomputers came along to challenge mainframes on cost. The net effect of this wasn't to kill the mainframes, but rather to slice out situations where a smaller computer was appropriate, as for departments or medium-sized businesses. DEC made a lot of these, and when the VAX computers came along, Oracle managed to leverage them into a big contract with the US government for relational databases on VAXes. At the beginning of this mini era, there were time-share companies supplying programming and computer services to businesses, but that proved to be a very narrow-margin business, and generally couldn't compete with businesses buying their own computers. In the '80s, PCs came along, further slicing out situations that could use smaller computers, and by the middle of the '80s they had wiped out some minicomputer companies. But IBM is still around, and even has mainframes emulating PCs running Linux. And now we have super-servers doing the same thing.
So back in the minicomputer days, as the VAXes aged, Oracle got together with Sun to make Unix/Oracle systems. Larry's idea then was "the Network is the Computer," and he even formed a company to make terminals for that, very similar in concept to how we have browsers now. But that all failed, because of PCs.
So one way or another, someone still has to write and manage applications and the databases they require. It is some kind of genius that Larry can sell the same apps to run in the cloud or on premises, sell the services to do either, and sell more services to move them back and forth. We've seen some huge failures in the cloud, and in modern technology in general. The "self-tuning" database and the like are just tools, subject to the shortcomings of any tool produced and used by humans. I'm somewhat skeptical, but I think it just means increasing specialization will be necessary to use these tools, especially in the performance area. Even plain old application building has become complex and specialized: yes, you can blast out some app quickly if you know a particular framework and its methods, but those are increasing in complexity too. There's a whole technology stack, and you need to be able to figure out where in the stack any problem lies. On the design side, there's always some paradigm of the month that doesn't take the tech stack properly into account, leading to silliness like using a powerful database as a data dump. Or leaving a critical part of the system unpatched because it's expensive to keep up. Or reinventing the wheel, forgetting basic UI maxims.
So what's a DBA to do? Depending on your likes, abilities, and dumb luck, you can either specialize in parts of the stack, functional operations, or a broad range of things as a utility player between the business and the tech.
It just boils down to "survival of the fittest". The DBAs who are valuable to the business, because they have relevant knowledge and know how to adapt and bring value, will retain their jobs and get pay rises, while the mediocre DBAs will be made redundant and replaced by scripts or cloud services, which in turn will hire more DBAs to implement them.
Bottom line, keep up with relevant technologies and don't get too cushy doing the same thing over and over again.