
DBA Blogs

Trace Files -- 3 : Tracing for specific SQLs

Hemant K Chitale - Sun, 2015-10-11 04:00
11g allows definition of tracing by SQL_ID as well.

Here is an example.

Given a particular SQL that has been executed in the past, which we've identified as:

SQL> select sql_id, sql_text, executions from v$sql where sql_id='06d4jjswswagq';

SQL_ID        SQL_TEXT                                                                              EXECUTIONS
------------- ------------------------------------------------------------------------------------- ----------
06d4jjswswagq select department_id, sum(salary) from hr.employees group by department_id order by 1 1


We could use either ALTER SESSION (from the same session) or ALTER SYSTEM (from another session, to trace all sessions) to enable tracing specifically for this SQL alone.

SQL> connect system/oracle
SQL> alter system set events 'sql_trace [sql:06d4jjswswagq] wait=true, plan_stat=all_executions';

System altered.


(Note: the options for "plan_stat" are "never", "first_execution" and "all_executions". This setting allows us to capture execution plan statistics.)
Once I have enabled SQL-specific tracing this way, it is not limited to a single session: it applies across all sessions that execute the SQL. Conversely, even if I execute other SQLs from the same session that executed this SQL, those other SQLs are *not* traced.

Thus, I started another session that executed :

SQL> select department_id, sum(salary) from hr.employees group by department_id order by 1;

DEPARTMENT_ID SUM(SALARY)
------------- -----------
10 4400
20 19000
30 24900
40 6500
50 156400
60 28800
70 10000
80 304500
90 58000
100 51608
110 20308


12 rows selected.

SQL> select count(*) from hr.employees;


SQL> select count(*) from hr.departments;


The trace file only captured the target SQL. The other two SQLs were *not* in the trace file.  Tracing is not bound to a session, so if you have multiple sessions executing the target SQL, each session creates a trace file.
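To find the generated trace files, you can query v$diag_info from the traced session (a quick sketch; the exact directory varies by version and configuration):

```sql
-- 'Diag Trace' is the directory holding the trace files;
-- 'Default Trace File' is the current session's own trace file
select name, value
from   v$diag_info
where  name in ('Diag Trace', 'Default Trace File');
```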

Tracing is disabled with :

SQL> alter system set events 'sql_trace [sql:06d4jjswswagq] off';

System altered.


Thus, just as in the previous post where I demonstrated tracing by module and action, we can enable tracing for a specific SQL.

Categories: DBA Blogs

Log Buffer #444: A Carnival of the Vanities for DBAs

Pythian Group - Fri, 2015-10-09 13:53

This Log Buffer Edition covers some blog posts of Oracle, SQL Server and MySQL from this past week.


Oracle:

  • Oracle Utilities Application Framework V4. includes a new help engine and changes to the organization of help.
  • Very simple oracle package for HTTPS and HTTP.
  • Oracle ZFS Storage Appliance surpasses $1B in Revenue (Oct 6th).
  • Tim spotted a problem with the PDB Logging Clause.
  • How to Pass Arguments to OS Shell Script from Oracle Database.

SQL Server:

  • How efficient is your covered index?
  • Storing Passwords in a Secure Way in a SQL Server Database.
  • There is plenty that is novel and perhaps foreign to a new R user, but it’s no reason to abandon your hard-earned SQL skills.
  • SSIS Design Pattern – Staging Fixed Width Flat Files.
  • Shorten Audits, Tighten Security with Minion Enterprise.


MySQL:

  • Confusion and problems with lost+found directory in MySQL/Galera cluster configuration.
  • Simplifying Docker Interactions with BASH Aliases.
  • Getting started MySQL Group Replication on Ubuntu with MySQL Sandbox.
  • MySQL lost “AUTO_INCREMENT” after a long time.
  • Using Apache Spark and MySQL for Data Analysis.


Learn more about Pythian’s expertise in Oracle, SQL Server & MySQL.

Categories: DBA Blogs

My Sales Journey: #6

Pythian Group - Fri, 2015-10-09 13:43


Last week, I delved into the mind of our executive tier to pinpoint how to make outreach appealing to that group. Although most companies want to speak to these decision makers, it is an uphill battle to get the ear of a VP or executive. Sometimes we must reach the people who are talking to their bosses each day, and these are the managers in the trenches, living with the problems we want to solve.
So today, let's step inside the mind of a manager:

Managers maintain the integrity and continual functioning of mission-critical operations. They evaluate and make recommendations regarding new technologies, tools and techniques. They keep a head count and know where the gaps exist in their teams. Managers are responsible for their team's performance while also being expected to do more with less.

They are also the people most likely to be willing to listen if you have a solution to their problems. Tell them how you or your company has solved similar problems for others. Give them proof.

Assure them of their importance. Make seeking your help a positive experience for them and their team. Managers need to know that you are not taking their team's jobs, or their own. As a managed services firm, we like our clients to know that we can augment and work as an intimate extension of their in-house team.

If you should be so lucky that a forward-thinking manager wants to introduce you to their executive team, give them credit. There are tons of people who are too afraid to do things differently and are resistant to change, so when someone takes that leap for you, do not leave them behind.

You can work either up the food chain or down as a sales professional. Know that catching the big fish is hard but rewarding, and most times you will need to get creative with your hook. Isn't that what sales is all about, in a nutshell? Personally, I do not have a preference, but I am also new to the game.

I’d love to hear your thoughts about your sales process: who do you talk to first, what gets you a higher response rate, and do you go after the big fish or work your way up the ladder? Leave me a message!

Categories: DBA Blogs

When it comes to Black Friday’s online rush, even we Brits don’t want to queue

Pythian Group - Fri, 2015-10-09 08:10

Black Friday is a pretty new concept in the UK. Traditionally an American post-Thanksgiving shopping day, it has recently gained popularity in the UK. It really took off last year and defined the start of what I heard being called the Golden Quarter – from November through to the end of January – when retailers will make 40-50 percent of their annual sales.

With anything new, people take a while to find their feet and will try out new things. I hope that one of the technologies trialled widely on e-commerce sites during this period last year isn’t used again this year: the queuing system.

The idea was sound: instead of having too many customers hitting a site and causing poor performance for everyone, a number were put into a virtual waiting room and allowed onto the site in order. This meant that, once in, everyone could shop quickly without any risk of a crash from overload.

But in practice, it seemed to customers as if the site wasn’t working. The Press reported that sites had “crashed” anyway and the user experience was awful. You might queue to get into Harrods on the first day of a Sale, or you might queue online to get a ticket to the last One Direction concert, but with plenty of choices available, users simply hopped elsewhere.

To me, the most frustrating thing was that this seemed like a lazy solution. It is not difficult to build your e-commerce site for the peaks you should be expecting.

Why not ensure you can spin up extra capacity with a cloud provider, if needed? And why not take the time to configure the database structure? This would mean that, for those few days when 1000 people a minute are wanting to see the shoes you have in stock, they all can. Easily and without delay.

Building an e-commerce site – or indeed any application – to be scalable to handle peak traffic should be a high priority for your developers, DBAs, and sys admins. With the range of available cloud technologies, database technologies, and automation tools there is no excuse.

Let’s hope that for Black Friday 2015, for once the UK is queue free!

By the way, if you need a hand with any of the areas discussed above, please do get in touch. As the majority of Pythian’s retail clients are in the US, we’ve had many years of practice, and we ensure our clients’ peak demands are handled smoothly.

Categories: DBA Blogs

Links for 2015-10-08

Categories: DBA Blogs

SQL Server Fast Food

Pythian Group - Thu, 2015-10-08 13:55


An environment where you have a high number of databases, on one server or across many, can make something as simple as adding a user account time consuming. You have the option of using the GUI in SQL Server Management Studio (SSMS); if it were a rush to get something in place for 8 or 10 databases, I can see possibly doing it that way to get it done. You could also do it with a bit of typing, using T-SQL and a cursor or that famed, undocumented procedure sp_MSForeachdb.

I recently had a request from a customer that fell into the above scenario, and since I used PowerShell to handle the request, I wanted to show how I went about getting it done. I think this is a situation where either T-SQL or PowerShell will work; I just picked the one I wanted to use.

Breaking this down, these are the basic steps I had to perform:

  1. Check for the login
  2. Create user
  3. Create role
  4. Assign INSERT and UPDATE to the role
  5. Add the user to the database role

All in all, that is not too much, if you understand how PowerShell and SMO work for you. If you are not familiar with PowerShell, you can reference the recent series I published on the Pillars of PowerShell, which should help you get started. When I was learning PowerShell, I always found I learned best by reading through other folks' scripts to find out how stuff was done. You can find the full script at the end of this post if you want to skip right to it; I won't be offended.

One thing I always find useful with SMO is remembering that MSDN documents everything for the namespace Microsoft.SqlServer.Management.Smo. If you spend the time to review it and at least get familiar with how the documentation is laid out, using SMO and finding answers becomes much easier.


The Bun

As always the first step is going to be to create the object for the instance or server:

$s = New-Object Microsoft.SqlServer.Management.Smo.Server $server

To verify the login exists, I utilized one of the common methods available on the string type, Contains(). Now, you generally use the Get-Member cmdlet to find the various methods available for an object, but this particular one does not show up if you run: $s.Logins | Get-Member. There is a set of methods that comes with each type of value (e.g. string, integer, date, etc.), and the Contains() method is one that comes with the string type. There are two ways I have found to discover these types of methods:

  1. Pass the value type to Get-Member [e.g. “A string” | Get-Member]
  2. Use tab completion [e.g. Type out “$s.Logins.” with the period on the end, and then just start hitting the tab key]

If you want a bit of exercise, you can see if you can add code to actually create the login if it does not exist. I was only working with one server in this case, so I did not bother adding it this time around.

Being that I need to add these objects to each database I start out by getting the collection of databases on the instance:

$dbList = $s.Databases

From there I am simply going to iterate over each database that will be stored in the variable: $d.


The Meat

The first thing I want to do is verify the database is online and accessible; each database (e.g. $d) has a property called "IsAccessible" that simply returns true or false. The T-SQL equivalent would be checking the value of the status column in sys.databases. One shortcut you will see in PowerShell at times is the use of an exclamation point ( ! ) before an object in an if statement; this basically tells it to check that false is returned:

if (!$d.isAccessible) {…}
#equates to:
if ($d.isAccessible -eq $false) {…}

Now that I know the database is online, I need to create and modify some objects in it. When dealing with objects such as user accounts, roles and tables in a database, in PowerShell these are going to be classes under the SMO namespace. So in this script I am going to use the following classes for the user and database role:

Under the User and DatabaseRole classes you will see the Constructors section, which shows what is needed to create the object. So, for example, digging into the link for the database role constructor, I see it takes two parameters:

  1. Microsoft.SqlServer.Management.Smo.Database object
  2. a string value of what you want to call the role.

The $d variable is my database object, so that is covered, and then I wrote the function to pass the database role name in via $roleName:

$r = New-Object Microsoft.SqlServer.Management.Smo.DatabaseRole($d,$roleName)

I continued through the article for the DatabaseRole class, and in the Properties list I see that some have a description of "Gets the…" and some have "Gets or sets…". This basically means "Gets the…" = read-only property, and "Gets or sets…" = the property can be read or modified. When you use CREATE ROLE via T-SQL, you have to provide the name of the role and the owner of that role. I passed the name of the role when creating the database role object ($r), so I just need to set the owner and then call the method to actually create it:

$r.Owner = 'dbo'
$r.Create()

The Ingredients

The only things I needed to do in this situation were to set INSERT and UPDATE permissions, at the schema level, to handle the client's requirements. Assigning permissions in SMO took me a bit to figure out; the majority of the time writing this script, actually. There are two additional classes I needed to handle setting permissions on a schema:

I create the object for the schema according to the documented constructor. Within each class that deals with specific objects in a database that can be granted access, you should find a Grant() method, and in my case what I need is Grant(ObjectPermissionSet, String[ ]). The first parameter is an object that contains the permissions I want to assign to this role, and this is where the second class above comes into play.

The properties of the ObjectPermissionSet class are the permissions I can assign via SMO to an object in a database; simply setting one to true will assign that permission:

$dboSchema = New-Object Microsoft.SqlServer.Management.Smo.Schema($d,'dbo')
$perms = New-Object Microsoft.SqlServer.Management.Smo.ObjectPermissionSet
$perms.Insert = $true
$perms.Update = $true
$dboSchema.Grant($perms,$roleName)

Then, to finish it off, the last line in the script adds the user as a member of the database role created. You can find the full script below for your pleasure. Enjoy!


Full Script
Import-Module SQLPS -DisableNameChecking -NoClobber

function Create-RoleUserInAllDatabases {
<#
.SYNOPSIS
    Create database role, assign permissions, create user, assign user to database role.
.DESCRIPTION
    Iterates through all databases that are online and creates the role and user
    (if the login exists). Assigns INSERT and UPDATE permissions to the role created.
    You can find the other properties that can be set on the MSDN site.
.PARAMETER server
    String. Name of the instance or server (for default instance).
.PARAMETER loginToUse
    String. Current login on the instance, can be a Windows or SQL Login.
.PARAMETER roleName
    String. Name of the role you want to create.
.EXAMPLE
    Create the role AppRole and add "SQLLogin1" as a member of that role:
    Create-RoleUserInAllDatabases -server MyServer -loginToUse SQLLogin1 -roleName AppRole
#>
param (
    [Parameter( Mandatory=$true,ValueFromPipeline=$false )]
    [string]$server,
    [Parameter( Mandatory=$true,ValueFromPipeline=$false )]
    [string]$loginToUse,
    [Parameter( Mandatory=$true,ValueFromPipeline=$false )]
    [string]$roleName
)

$s = New-Object Microsoft.SqlServer.Management.Smo.Server $server

# Make sure login already exists
if (!($s.Logins.Contains($loginToUse))) {
    Write-Warning "$loginToUse does not exist on $server"
    return
}
$dbList = $s.Databases

foreach ($d in $dbList) {
    # if database is not accessible
    if (!$d.IsAccessible) {
        Write-Verbose "$($d.Name) is offline"
    }
    else {
        Write-Verbose "******WORKING ON*****************$d******************"
        # Check if user already exists in database
        if (!($d.Users.Contains($loginToUse))) {
            Write-Verbose "$loginToUse does not exist, creating"
            $u = New-Object Microsoft.SqlServer.Management.Smo.User ($d,$loginToUse)
            $u.Login = $loginToUse
            $u.Create()
        }
        else {
            Write-Verbose "$loginToUse already exists, skipping step"
        } #end check if user exists

        # Check if role already exists in database
        if (!($d.Roles.Contains($roleName))) {
            Write-Verbose "$roleName does not exist, creating"
            $r = New-Object Microsoft.SqlServer.Management.Smo.DatabaseRole($d,$roleName)
            $r.Owner = 'dbo'
            $r.Create()
        }
        else {
            Write-Verbose "$roleName already exists, skipping step"
        } #end check if role exists

        # grant permissions on the dbo schema to the role
        $dboSchema = New-Object Microsoft.SqlServer.Management.Smo.Schema($d,'dbo')
        $perms = New-Object Microsoft.SqlServer.Management.Smo.ObjectPermissionSet
        $perms.Insert = $true
        $perms.Update = $true
        $dboSchema.Grant($perms,$roleName)

        # now add user to the role
        $d.Roles.Refresh()
        $d.Roles[$roleName].AddMember($loginToUse)
    } #end check database is online
} #end foreach $dbList
} #end function


Discover more about our expertise in SQL Server.

Categories: DBA Blogs

Issues with Plan Cache Reuse & Row Goal Optimization

Pythian Group - Thu, 2015-10-08 13:11


I am presenting here on behalf of my colleague Fabiano Amorim (he is busy resolving other exciting performance issues… :-D).

Fabiano had an interesting case with one of our customers that is very common in SQL Server.

The case is about a performance issue caused by two optimizer decisions not working well together:


Problem Description

Let’s review the following query:

select top 1 col_date from tab1
where col1 = 10
and col2 = 1
and col3 = 1
order by col_date asc


Table tab1 has two indexes:

  1. ix1 (col1, col_date, col2) include(col3)
  2. ix2 (col1, col2, col3) include(col_date)


The Query optimizer (QO) has two query plan options:

  1. select -> top -> filter -> index seek (ix1): Read the ordered index ix1 b-tree, seeking on “col1 = 10”, and apply the residual predicate “col2 = 1 and col3 = 1”. After reading just 1 row (TOP 1) the execution is finished, since the index is ordered by col1, col_date: the first col_date returned is already the TOP 1 ASC according to the index order.
  2. select -> top N sort -> index seek (ix2): Read the covering index ix2 b-tree (notice it has all the needed columns), seeking on “col1 = 10 and col2 = 1 and col3 = 1”, and get col_date from the index leaf level (included column). Use the “top N sort” algorithm to sort and keep only the TOP 1 row, then finish execution.

The problem is that if the QO chooses the first option, it is only good for high-selectivity predicates.
For instance, let’s suppose that “col1 = 10” returns 5 rows; remember that index ix1 is ordered by col1, col_date, col2:


col1 | col2 | col3 | col_date
10   | 4    | 4    | 2015-12-01
10   | 3    | 3    | 2015-12-02
10   | 1    | 1    | 2015-12-03
10   | 5    | 5    | 2015-12-04
10   | 2    | 2    | 2015-12-05


After seeking the index, SQL will need to apply the residual predicate (“col2 = 1 and col3 = 1”) until it reaches the “row goal”: the TOP iterator is asking for just one row, and in this case the third row matches, so SQL Server returns the first row that satisfies the residual predicate.

So, in this case it has to read only 3 rows. So far so good…

Now, let’s suppose SQL created that plan and is going to reuse it for a new value in the col1 filter:


select top 1 col_date from tab1
where col1 = 99
and col2 = 1
and col3 = 1
order by col_date asc


What if the seek (“col1 = 99”) returns 2 million rows? Now this plan is not so good, since it will need to apply the predicate to a couple of million rows before finding a match:


col1 | col2 | col3 | col_date
99   | 2    | 2    | 2015-12-01
99   | 2    | 2    | 2015-12-02
…after a couple of million rows…
99   | 1    | 1    | 2015-12-03
99   | 2    | 2    | 2015-12-04
99   | 2    | 2    | 2015-12-05


In this case, using the second option is better: just seek the b-tree for all values (col1 = 99 and col2 = 1 and col3 = 1), which will return 1 row… TOP N sort will do almost nothing and execution will finish quickly.

Here is the problem: most of the time, SQL knows whether to use option 1 or option 2 based on the parameter values. But if it is reusing a plan from cache, the optimization path may already be set improperly, resulting in the well-known issue called “parameter sniffing” (reusing a plan that is wrong for the specific set of rows)… That means the row goal optimization should not have been used when there is a covering index.

Unfortunately, by default the QO “thinks” this is cheaper than “seek + top N sort”… Of course, it all depends on the distribution of the data… So, in a nutshell, the QO sometimes chooses the row goal optimization where it should not be used; therefore we should pay extra attention to those kinds of plans…


Possible Solutions

There are many alternatives to fix it.

Some examples:

  1. Force the index (index=ix2)
  2. Option(recompile)
  3. drop the index ix1, define ix2 as a unique (tells QO that only 1 row will be returned)
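As a sketch, using the example's table and index names (and, for option 3, assuming the ix2 key really is unique in the data), the three alternatives look like this:

```sql
-- 1. Force the index
select top 1 col_date from tab1 with (index(ix2))
where col1 = 99 and col2 = 1 and col3 = 1
order by col_date asc;

-- 2. Recompile on every execution, so the plan matches the actual values
select top 1 col_date from tab1
where col1 = 99 and col2 = 1 and col3 = 1
order by col_date asc
option (recompile);

-- 3. Drop ix1 and redefine ix2 as unique, telling the QO at most 1 row matches
drop index ix1 on tab1;
create unique index ix2 on tab1 (col1, col2, col3)
    include (col_date) with (drop_existing = on);
```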

Each one of the above has advantages and disadvantages.

We also need to ensure that statistics are up to date!


Additional Resources


Discover our expertise in SQL Server. 

Categories: DBA Blogs

Simplifying Docker Interactions with BASH Aliases

Pythian Group - Thu, 2015-10-08 12:21
Landing a Docker Whale

Docker has been consuming my life in the last few weeks. I have half a dozen projects in progress that use containers in some fashion, including my Visualizing MySQL’s Performance Schema project.

Since I prefer to work from a Mac laptop, I have to utilize a Linux Virtual Machine (VM) which runs the Docker daemon. Luckily, Docker Machine makes this a very simple process.

However, interacting both with Docker and Docker Machine does introduce some additional commands that I would rather simplify for the repeatable use-cases I’ve come across. With BASH aliases, this is not a problem.

Is My Docker Environment Set Up?

When working with Docker through Docker Machine, you first have to set up your environment with various DOCKER_* variables, such as these:

View the code on Gist.
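The embedded gist is not reproduced here, but the variables docker-machine sets typically look like this (illustrative values, not the original post's):

```shell
# Variables the Docker client reads to reach a remote daemon
# (values below are illustrative for a machine named "dev")
export DOCKER_TLS_VERIFY="1"
export DOCKER_HOST="tcp://192.168.99.100:2376"
export DOCKER_CERT_PATH="$HOME/.docker/machine/machines/dev"
export DOCKER_MACHINE_NAME="dev"
```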

The first alias is an easy way to check that the Docker environment is set up.

View the code on Gist.
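Since the gist is not embedded here, a minimal version of such an alias (my reconstruction, not necessarily the author's exact code) could be:

```shell
# de: dump any DOCKER_* variables currently exported in this shell;
# empty output means the environment is not set up yet
alias de='env | grep DOCKER_'
```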

Now, all I have to type is de, and I get the Docker environment output:

View the code on Gist.

Setting up My Docker Environment

But how do you set up the environment with Docker Machine? The docker-machine command provides the details:

View the code on Gist.

Notice that the comments indicate you have to run the command through eval to get the terminal setup correctly. I don’t want to type that out each time I open a new terminal.

The docker-machine command requires the name of the VM to set up as an argument, so I’ve created a function to accept the argument:

View the code on Gist.
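A sketch of such a function (the name denv is my own; the gist may differ) that wraps the eval call and takes the VM name as its argument:

```shell
# denv: point the current shell's Docker client at the named machine
# by eval-ing the export statements that `docker-machine env` prints
denv() {
    eval "$(docker-machine env "$1")"
}
```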

Each time I open a terminal, I can set up the environment:

View the code on Gist.

If you only use one Docker VM for local development, you can hardcode its name so that the command automatically sets up the Docker environment whenever a new terminal is created.

Cleaning Out Docker Images

The last helpful alias I have comes from building and re-building containers that have left old images on my VM.

View the code on Gist.

The docker-clean command cleans up all dangling images:

View the code on Gist.
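A common one-liner for this (again my sketch, since the gist is not shown) removes every dangling image ID that `docker images` reports:

```shell
# docker-clean: remove all dangling (untagged) images left behind by rebuilds
alias docker-clean='docker rmi $(docker images -q --filter dangling=true)'
```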

And running the docker-clean command yields:

View the code on Gist.

I put all of these aliases and functions together in my ~/.bash_profile* script, which is executed anytime I open a terminal window:

View the code on Gist.

*Note: Instead of putting these aliases and functions in ~/.bash_profile, other distributions would look for them in ~/.bashrc or ~/.bash_aliases to ensure they are available for all types of interactive shells.

If you have any other commands to simplify Docker interactions, please share them in the comments!


Discover more about our expertise with DevOps.

Categories: DBA Blogs

Partners Guide to Oracle Cloud - The Oracle Cloud Playbooks

OPN has published the Oracle Cloud Platform Strategic Partner Playbook for a while now, designed exclusively for partners. This was created in close partnership with Product Marketing, Product...

We share our skills to maximize your revenue!
Categories: DBA Blogs

My Delphix presentation at Oaktable World

Bobby Durrett's DBA Blog - Wed, 2015-10-07 17:52

It is official.  I will be doing my Delphix presentation at Oaktable World during the Oracle OpenWorld conference at the end of this month.  My talk is at 9 am on Tuesday, October 27.

I will describe our journey as a new Delphix customer with its ups and downs. I tried to have the spirit of a user group talk where you get a real person’s experience that you might not get from a more sales oriented vendor presentation.

Kyle Hailey, an Oaktable member and Delphix employee, will host my talk.  I have been very impressed by Kyle’s technical knowledge, and he will be with me to answer questions about Delphix that I could not answer.  I think it will be a good combination of my real-world user experience and his depth of technical background in Delphix and Oracle performance tuning.

If you are going to OpenWorld and if you want to know more about Delphix come check it out.  Also, feel free to email me or post comments here if you have any questions about what the talk will cover.


Categories: DBA Blogs

Partner Webcast – Rapid Digital Transformation with Oracle Process Cloud

Today, IT is heavily optimized to develop and manage longer running durable applications with evolutionary change, current demand calls for creation of disposal applications and fast frequency...

We share our skills to maximize your revenue!
Categories: DBA Blogs

Partner Webcast – Oracle Mobile Cloud Service: Gates to Enterprise Mobility for Your Business

Nowadays Mobility has definitely disrupted business models. Mobile first companies that are using the context of mobile to create unique applications are creating new business models disrupting and...

We share our skills to maximize your revenue!
Categories: DBA Blogs

Oracle E-Business Suite: Virtual Host Names

Pythian Group - Tue, 2015-10-06 07:56

The ability to use virtual host names with Oracle E-Business Suite is a feature I have been waiting on for a long time. When I finally saw a post on Steven Chan’s blog about it, I was very excited. But when I finally got to review the MOS note “Configuring Oracle E-Business Suite Release 12.x Using Logical Host Names”, I was left disappointed.

In my opinion, the main advantage of using virtual host names is during a DR failover scenario. By using virtual hosts we can set up the servers in both the primary and secondary datacenters to use the same virtual hostname, even though their physical hostnames are different. This virtual hostname setup helps when we fail over services and databases to a secondary datacenter, as we don’t have to reconfigure the application to use new physical hostnames. Currently, when we install E-Business Suite to use a virtual hostname, the Concurrent Managers don’t work, as they internally use the physical hostname to communicate.

The new MOS note describes this very feature of using virtual hostnames with Oracle E-Business Suite. But why am I disappointed? Because it leaves a very important use case out. In most cases where virtual hostnames are used, the servers keep their different physical hostnames; i.e., if you run the hostname or uname commands you will see the actual physical hostname, and the virtual hostname is present only in DNS and the hosts file. This scenario is not covered by the MOS note. Instead, the MOS note asks us to reconfigure the server with the virtual hostname, such that typing the hostname or uname command shows the virtual hostname instead of the physical hostname.

I believe the need to reconfigure the server to use a virtual hostname defeats the main purpose of setting up virtual hostnames, making this MOS note useless :(

Thus, I will keep waiting for this out-of-the-box feature. I currently have a custom in-house method for using virtual hostnames with E-Business Suite that I will blog about in the future.


Discover more about our expertise with Oracle.

Categories: DBA Blogs

Submit an abstract for Georgia Oracle User Group (GaOUG) Tech Day 2016

DBASolved - Mon, 2015-10-05 10:15

In 2014, Danny Bryant, Stewart Bryson and I, from the Atlanta area, were added to the board of directors of the Georgia Oracle User Group (GaOUG). With us on the board, we initiated a rebranding of the user group from GOUSER to GaOUG in February 2015, with much success. We then followed that up with two quarterly events in April and July 2015, which proved to be even bigger successes, and we continue to build on each one thanks to the people we have been able to attract to these quarterly events. After that success, the board of directors established a goal of bringing the best speakers, locally, nationally and internationally, to the Atlanta area for a one-day conference in 2016. So I’m here to promote the conference and hopefully convince you to submit an abstract to this new event on the conference circuit.

Little Background:

When I first moved to the Atlanta area (2001), I didn’t know anything about Atlanta, much less the Oracle community. I joined GOUSER around 2006, with little going on in the community (a story for another time). It wasn’t until 2012 that I first ventured onto the conference stage, at Oracle Open World 2012. My first conference was nerve-racking, to say the least; however, I was introduced to many great people that year, and my career, my personal and professional networks, and my friendships have all benefited from that experience. These are just some of the reasons why I keep submitting annually to a wide range of conferences and why I’m helping to bring this conference to Atlanta.


The aim of the GaOUG Tech Day 2016 conference is to start small and grow into the best regional Oracle user group conference in the Southeast! This can only be achieved with the help of great speakers (new and established) from the Oracle community! With that being said, our call for papers is currently open (submit here).

GaOUG Tech Day 2016 will have three categories for abstract submissions: Database & Development, Middleware, and Applications. These categories cover a wide range of technologies from the Oracle stack, plus many others.

Submitting an Abstract:

Before you submit your abstracts, there are a few reminders that you should be aware of that make a great abstract:

  1. Take the time to make a great abstract title and fill out the abstract and summary completely!
  2. Run it through a spelling and grammar check.  If you submit a “sloppy” abstract with misspellings and grammatical errors, how can we know that you’ll take the time to ensure the presentation is delivered professionally and is technically accurate?
  3. List a few take-aways the attendee will leave with.  What is the value that will be gained by attending your session?
  4. Fill out your speaker biography.  We like to know a bit about you and why you are important to have presenting at the conference.
  5. No marketing!  Keep your session technical.  Our conference is a technical conference, and nothing irks our attendees like marketing!  If they like what you are teaching, they’ll seek out your company and/or product.  Trust me!

(Tips were provided by DBAKevlar, thanks Kellyn!)

First/Local Time Speakers:

Finally, if you are submitting an abstract and you are a first-time or local speaker, do not hesitate to reach out to the board and ask for guidance and/or mentoring. The board of directors has the experience to assist you with your submissions and selections. We are looking forward to hearing from new speakers in the community and to helping build your confidence in the speaking arena.

Submit your abstract today! Deadline for submissions is November 2nd, 2015!

Filed under: GaOUG, General
Categories: DBA Blogs

Five Years, Five Top 5’s

Pythian Group - Sun, 2015-10-04 22:59


Five Years, Five Top 5’s

On October 4th, 2010 I joined Pythian. At that time Human Resources (HR) was a team of one, supporting 90 employees in 13 countries. Five years later on October 4th 2015, HR is now a team of fifteen, and today Pythian celebrates a milestone as we reach 400 employees in 36 countries!

What an incredible journey (…and yes, it’s all about the journey)!

The opportunity to work with stunning colleagues and to collectively build a team of passionate and talented HR professionals who continuously make me laugh, inspire me with their creativity, and challenge me to be a better leader is both a privilege and an honour. I am fortunate to work alongside them every day.

As members of Pythian’s HR team, we are business leaders. We care about (and contribute to) the success of Pythian. We solve the puzzles that can come with growing a business from 90 to 400 employees in five years. We deliver quality of service to our clients. We make a difference. We value our employees. We have fun.

To celebrate five incredible, challenging and rewarding years I share my Top 5’s.


Five Things I Appreciate Most About Pythian HR:

We are strategic.

We think global, not local.

We are productive, not busy.

We brainstorm without the “but”.

We are consistent, fair and we care.


Five Favorite Moments:

Each HR hire and expanding our HR team globally in 2014.

Celebrating our 2013 HR Initiative of the Year win with our world class HR team.

Reaching 200 employees in September 2012.

Celebrating our first Geek Day on May 25th 2011 and observing how our global team spirit has grown.

Delivering our first BORG and Orientation session on January 4th 2011.


Five Favorite Programs We Have Built:

Love Our Community, the programs that supports our global employees and their communities.

SEED, the Self Directed training and professional development fund.

Delphic Journey, the Performance Feedback program.

Delphic Iris (the newsletter) and Newsflash (the weekly update).

Pythianology, the internal speaker series that showcases our talents, passions and interests.


Favorite Leadership Lessons in 10 Words or less:

Leadership is an art, not a science.

Lead the way you want to be led.

If you don’t lead by example, you are not leading.

Embrace Self Awareness, but focus your efforts on Self-Regulation.

Words matter. Less is more.


Five Favorite Books from the Last 5 Years:

You Are the Placebo: Making Your Mind Matter

The Power of Habit

When All You Have Is Hope

Turn The Ship Around

Speaking As A Leader


Steve Jobs once said that “Your work is going to fill a large part of your life, and the only way to be truly satisfied is to do what you believe is great work.  And the only way to do great work is to love what you do”.

Great work, amazing colleagues, and the opportunity to do what I love…thank you Pythian for a fantastic five years!

Categories: DBA Blogs

Enterprise Manager 12c R5 - Manage Hybrid Cloud from a Single Interface

Oracle recently announced Oracle Enterprise Manager 12c Release 5, which simplifies the journey to the cloud.  For the first time, with Release 5, you will be able to manage your...

We share our skills to maximize your revenue!
Categories: DBA Blogs