Chris Foot

Remote DBA Experts Blog

Point-of-sale developers find themselves targets for cyberattacks [VIDEO]

Mon, 2014-12-22 11:42

Transcript

Hi, welcome to RDX! Banks, retailers and other organizations that use point-of-sale and payment software developed by Charge Anywhere should take extra precautions to ensure their databases are protected.

The software developer recently announced that it sustained a breach that may have compromised data that was produced as far back as 2009. This event reaffirms cybersecurity experts’ assertions that cybercriminals are targeting companies that provide payment software, as opposed to simply attacking merchants.

While it’s up to Charge Anywhere and other such enterprises to patch any bugs in their software, those using these programs should ensure their point-of-sale databases containing payment card info are strictly monitored. Informing DBAs on how to better manage their access credentials is another necessary step.

Thanks for watching!

The Database Protection Series Continues – Evaluating the Most Common Threats and Vulnerabilities – Part 1

Mon, 2014-12-22 10:12
Introduction

This is the second article of a series that focuses on securing your database data stores.  In the introduction, I provide an overview of the database protection process and what will be discussed in future installments.   Before we begin the activities required to secure our databases, we need to have a firm understanding of the most common database vulnerabilities and threats.

This two-part article is not intended to be an all-inclusive listing of database vulnerabilities and threat vectors.   We’ll take a look at some of the more popular tactics used by hackers as well as some common database vulnerabilities.   As we cover the topics, I’ll provide you with some helpful hints along the way to decrease your exposure.  The list is not in any particular order.   In future articles, I’ll refer back to this original listing from time-to-time to ensure that we continue to address them.

Separation of Duties (Or Lack Thereof)

Every major industry regulation is going to have separation of duties as a compliance objective.  If your organization complies with SOX, you should be well aware of the separation of duties requirements.  In order for my organization to satisfy our PCI compliance objectives, we need to constantly ensure that no single person, or group, is totally in control of a security function.    Since we don’t store or process PCI data, we focus mainly on securing the architecture.  So, I’ll use my organization’s compliance activities to provide two quick examples:

  • We assign the responsibility of security control design to an internal RDX team and security control review to a third-party auditing firm.
  • Personnel assigned the responsibility of administering our internal systems, which include customer access auditing components, do not have privileges to access our customers’ systems.

The intent is to prevent conflicts of interest, intentional fraud, collusion or unintentional errors from increasing the vulnerability of our systems. For smaller organizations, this can be a challenge as it introduces additional complexity into the administration processes and can lead to an increase in staff requirements. The key is to review all support functions related to the security and access of your systems and prioritize them according to the vulnerability created by misuse. Once the list is complete, you decompose the support activities into more granular activities and divide the responsibilities accordingly.

Unidentified Data Stores Containing Sensitive Data

It’s a pretty simple premise – you can’t protect a database that you don’t know about. The larger the organization, the greater the chance that sensitive data is being stored and not protected. Most major database manufacturers provide scanners that allow you to identify all of their product’s installations. These are most often used during the dreaded licensing audits. As part of our database security service offering, RDX uses McAfee’s Vulnerability Scanner to identify all databases installed on the client’s network.

Once you identify these “rogue” data stores, your next goal is to find out what’s in them.  This can be accomplished by asking the data owner to provide you with that information.  A better strategy is to purchase one of the numerous data analyzers available on the market.  The data analyzer executes sophisticated pattern matching algorithms to identify potentially sensitive data elements.   Because of the complex matching process that has to occur, there aren’t a lot of free offerings on the web you can take advantage of.   In our case, McAfee’s database scanner also includes the data identification feature.   It helps us to uncover the sensitive data elements that are hidden in our customers’ database data stores.
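
To make the pattern matching idea concrete, here’s a minimal Python sketch of the kind of scan a data analyzer performs against sampled rows. The regular expressions and the sample data are purely illustrative; commercial tools such as the McAfee scanner mentioned above use far more sophisticated matching.

```python
import re

# Illustrative patterns only -- commercial analyzers use far more
# sophisticated matching (checksums, context and data dictionaries).
PATTERNS = {
    "credit_card": re.compile(r"\b(?:\d[ -]?){12,15}\d\b"),
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
}

def scan_rows(rows):
    """Return the names of any sensitive-data patterns found in the sampled rows."""
    findings = set()
    for row in rows:
        for value in row:
            for name, pattern in PATTERNS.items():
                if pattern.search(str(value)):
                    findings.add(name)
    return findings

# Hypothetical sample pulled from a newly discovered "rogue" data store
sample = [("John Doe", "123-45-6789"), ("Jane Roe", "4111 1111 1111 1111")]
print(scan_rows(sample))  # expect both 'ssn' and 'credit_card' to be flagged
```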

Clones from Production

Application developers have a particular affinity for wanting real-world data to test with. Can you blame them? There’s a myriad of data variations they need to contend with. Cloning live environments allows them to focus on writing and testing their code rather than on the mindless generation of information that attempts to mimic production data stores.

Cloning creates a whole host of vulnerabilities. The cloning process creates a duplicate of the production environment, and it needs to be secured accordingly. In addition, application developers shouldn’t have access to sensitive production data, yet they’ll need access to the cloned systems to perform their work. The first step is to identify the sensitive elements and then create a strategy to secure them.

Data masking, also known as data scrambling, allows administrators to effectively secure cloned data stores. After the cloning process is performed, the administrator restricts access to the system until the data scrambling process is complete. The key to a successful masking process is to replace the original data with as realistic a replacement as possible. Masking is not intended to be an alternative to encryption; its intent is to obscure the original values stored in the system.

There are several types of data masking offerings available. Your first step should be to check your database product’s feature list. Oracle, for example, provides a data masking feature. There’s also a wealth of third-party products available to you. If you have limited funding, search the internet for database masking and you’ll find plenty of free alternatives. Exercise strong due diligence if you have to use a free alternative, and check the data before and after scrambling – no matter which option you choose.
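
For illustration only, here’s a rough Python sketch of the masking idea: deterministic name replacement plus card-number scrambling that preserves the last four digits. The column choices and helper functions are hypothetical, and a real masking product – vendor-supplied or third-party – should be preferred for anything beyond a demo.

```python
import hashlib
import random

# A minimal masking sketch. Real masking tools preserve referential
# integrity, formats and data distributions far more carefully.
FIRST_NAMES = ["Alex", "Casey", "Jordan", "Morgan", "Taylor"]

def mask_name(original: str) -> str:
    """Deterministically replace a name so repeated values stay consistent."""
    digest = hashlib.sha256(original.encode()).hexdigest()
    return FIRST_NAMES[int(digest, 16) % len(FIRST_NAMES)]

def mask_card_number(card: str) -> str:
    """Keep only the last four digits; scramble the rest."""
    digits = [c for c in card if c.isdigit()]
    masked = [str(random.randint(0, 9)) for _ in digits[:-4]] + digits[-4:]
    return "".join(masked)

# Hypothetical cloned rows before handing the system to developers
rows = [("John Doe", "4111111111111111"), ("John Doe", "5500005555555559")]
masked = [(mask_name(n), mask_card_number(c)) for n, c in rows]
print(masked)
```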

If your cloning process creates any files that contain data, most often used to transfer the database from the source to target, wipe them out after the cloning process is complete.  Lastly, perform an in-depth account security review.   Remove all accounts that aren’t needed for testing and only create the new ones needed to provide the required functionality to your application developers.  In part 2 of this article, we’ll discuss backup, output and load files.   You will also need to secure your cloned systems’ files accordingly.

Default, Blank and Weak Passwords

Years ago, after my career as an Oracle instructor, I became an industry consultant.  One of my company’s offerings was the database assessment.  I reviewed customers’ environments to ensure they were optimized for performance, availability and security.     Those were the days when breaches, and the resulting attention paid to security, were far less prominent than they are today.   I’d bring up a logon screen to the database and attempt to log in using the default passwords that were available in Oracle.   At that time there were about a dozen or so accounts automatically available after database creation.   I always loved the reaction of the client as they watched me successfully access their environment.  At each login, I’d say “this one gives me DBA”, “this one gives me access to all of your data tables”….   You can’t believe how many times I successfully logged in using sys/change_on_install.

Although Oracle, like most major database vendors, has ratcheted down on default accounts, default and weak passwords are still a problem. The database documentation will contain a listing of the accounts that are automatically created during installation. Some advanced features will also require accounts to be activated. After database creation, and every time you install a new feature, do a quick scan of the documentation and then select from the users table to see if you have additional accounts to secure.
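
As a starting point, that quick scan of the user catalog can be scripted. The sketch below assumes an Oracle database reachable through the python-oracledb driver and an account with access to DBA_USERS; the list of default account names is deliberately short and should be replaced with the list from your version’s documentation.

```python
import oracledb  # assumes the python-oracledb driver is installed

# A handful of historically well-known default Oracle accounts; the full
# list for your version lives in the vendor documentation.
KNOWN_DEFAULT_ACCOUNTS = {"SYS", "SYSTEM", "SCOTT", "DBSNMP", "OUTLN", "MDSYS"}

def report_default_accounts(dsn: str, user: str, password: str):
    """Print accounts matching well-known defaults and their current status."""
    with oracledb.connect(user=user, password=password, dsn=dsn) as conn:
        with conn.cursor() as cur:
            cur.execute("SELECT username, account_status FROM dba_users")
            for username, status in cur:
                if username in KNOWN_DEFAULT_ACCOUNTS:
                    print(f"{username}: {status}")

# Hypothetical connection details:
# report_default_accounts("dbhost/ORCLPDB1", "audit_user", "********")
```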

All major database vendors including Oracle, SQL Server, MySQL and DB2 provide password complexity mechanisms.   Some of them are automatically active when the database is created while others must be manually implemented.    In addition, most allow you to increase the complexity by altering the code.

Once your complexity strategy is underway, you’ll need to use a password vault to store your credentials. I’m particularly fond of vaults that also provide a password generator. The password vault’s feature list will be important. When you perform your analysis of password vaults, some of the more important features to focus on are:

  • How the password vault enforces its own security
  • The logging and auditing features available
  • Encryption at rest and during transfer
  • Backup encryption
  • Early-warning systems
  • Dual launch key capabilities (it takes two personnel to check out a password)
  • Automatic notification when a password is checked out
  • How it handles separation of duties
  • Whether it can record the actions taken on the targets after the credentials are accessed

Unencrypted Data At Rest

Database encryption, if administered correctly, provides a strong defense against data theft. Most major database vendors provide data encryption as part of the product’s feature set. Microsoft SQL Server and Oracle call theirs Transparent Data Encryption (TDE); IBM provides a few alternatives, including InfoSphere Guardium; and MySQL provides a set of functions that perform data encryption.

You’ll also need to determine what data you want to encrypt as well as how you want to do it. Most of the vendor offerings allow you to encrypt data at different levels, including column, table, file and database. Like most database features, encryption requires you to balance security with processing overhead. Most encryption alternatives will add enough overhead to impact database performance, so identify the sensitive data elements and encrypt them.

Key management is crucial.   You can use all the encryption you want, but your environment will still be vulnerable if you don’t perform effective key management.    Here are a couple of RDX best practices for encryption key management:

  • Keep your encryption algorithms up to date. Vendors release updates for their encryption features on a fairly regular basis.   New database versions often contain significant security enhancements including new and improved encryption functionality.  There should be no debate with anyone in your organization.   If you are storing sensitive data, keep encryption functionality current.
  • DB2, Oracle, SQL Server and MySQL all have strong encryption features. If your database product doesn’t, you’ll need to rely on a robust, third-party product. Encryption isn’t a feature you want to skimp on or provide with a homegrown solution.
  • Store the keys securely in a safe, centralized location. During our security analysis, we have seen a few shops store their keys in the same areas as the data they are encrypting.    Storing them in a centralized location allows you to lock that storage area down, provide separation of duties and activate access alerts.
  • Key rotation is the tech term for changing your encryption key values on a regular basis. Depending on the database, this can be a fairly complex process; for other implementations, it’s fairly simple. Complex or simple, come up with a plan to rotate your keys at least yearly (a minimal rotation sketch follows this list).
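
To show what rotation can look like at the application level, here’s a minimal sketch using the Python cryptography package’s Fernet and MultiFernet classes. It is an illustration of the rotation concept only, not a substitute for your database vendor’s key management tooling.

```python
from cryptography.fernet import Fernet, MultiFernet

# Current and previous keys; in practice these live in a centralized,
# access-controlled key store, never alongside the data they protect.
old_key = Fernet(Fernet.generate_key())
new_key = Fernet(Fernet.generate_key())

# MultiFernet encrypts with the first key and can decrypt with any listed key,
# which is what makes rotation possible without an outage.
keys = MultiFernet([new_key, old_key])

ciphertext = old_key.encrypt(b"4111111111111111")  # data encrypted under the old key
rotated = keys.rotate(ciphertext)                  # re-encrypt under the new key

assert keys.decrypt(rotated) == b"4111111111111111"
```
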
Wrap-up

In part 2 of this article, we’ll cover unsecured data transmissions, securing input, output, report and backup files, SQL Injection and buffer overflow protection and a few other topics.  We’ll then continue our discussion on the process of securing our sensitive database data stores by outlining the key elements of a database security strategy.

Will the Grinch steal Linux’s Christmas?

Mon, 2014-12-22 02:00

For retailers, a vulnerability in their servers' operating systems could mean millions of dollars in losses, depending on how quickly hackers react to newly discovered bugs.

As Linux is affordable, efficient and versatile, many e-commerce and brick-and-mortar merchants use the OS as their go-to system, according to Alert Logic's Tyler Borland and Stephen Coty. The duo noted Linux also provides a solid platform on which e-commerce and point-of-sale software can run smoothly.

The Grinch targeting Linux? 
Due to Linux's popularity among retailers, it's imperative they assess a vulnerability that was recently discovered – a bug that has been given the nickname "Grinch" by researchers. Dark Reading's Kelly Jackson Higgins noted the fault hasn't been labeled as an "imminent threat," but it's possible that some malicious actors would be able to leverage Grinch to escalate permissions on Linux machines and then install malware. 

Coty and Borland noted that Alert Logic's personnel discovered the bug, asserting that it exploits the "su" command, which enables one user to masquerade as another. Access to the su command is governed by the wheel user group. When a Linux system is built, the default user is made a member of the wheel group, providing them with administrative rights.

"Anyone who goes with a default configuration of Linux is susceptible to this bug," he told Jackson Higgins. "We haven't seen any active attacks on it as of yet, and that is why we wanted to get it patched before people started exploiting it." 

Where the flaw lies 
Jackson Higgins maintained the Grinch is "living" in Polkit, a.k.a. PolicyKit for Linux. Polkit is a privilege management system that allows administrators to assign authorizations to general users. Coty and Borland outlined the two main concepts experts should glean from Polkit:

  1. One of Polkit's uses lies in the ability to determine whether the program should initiate privileged operations for a user who requested the action to take place. 
  2. Polkit access and task permission tools can identify multiple active sessions and seats, the latter of which is described as an "untrusted user's reboot request."

"Each piece of this ecosystem exposes possible vulnerabilities through backend D-Bus implementation, the front end Polkit daemon, or even userland tools that use Polkit for privilege authorization." wrote Coty and Borland.

Despite these concerns, Coty informed Jackson Higgins that this vulnerability won't have to be patched until after the holiday season, and only inexperienced Linux users are likely to encounter serious problems. 

FBI warns organizations of dangerous malware

Thu, 2014-12-18 14:40

Transcript

Hi, welcome to RDX! The Federal Bureau of Investigation recently sent a five-page document to businesses, warning them of a particularly destructive type of malware. It is believed the program was the same one used to infiltrate Sony's databases.

The FBI report detailed the malware's capabilities. Apparently, the software overwrites all information on computer hard drives, including the master boot record. This could prevent servers from accessing critical software, such as operating systems or enterprise applications.

Database data can be lost or corrupted for many reasons. Regardless of whether the data loss was due to a hardware failure, human error or the deliberate act of a cybercriminal, database backups ensure that critical data can be quickly restored. RDX's backup and recovery experts are able to design well-thought-out strategies that help organizations protect their databases from any type of unfortunate event.

Thanks for watching!

Oracle revises its database administrator accreditation portfolio

Thu, 2014-12-18 06:14

Enterprises looking for Oracle experts knowledgeable of the software giant's latest database solutions may discover some DBAs' certifications are no longer valid. 

In addition, companies using SAP applications would do well to hire DBAs who know how to optimize Oracle's server solutions. Many SAP programs leverage Oracle 12c databases as the foundation of their functionality, so ensuring that SAP's software can use and secure data within these environments efficiently is a must.

Releasing new accreditation standards 
TechTarget's Jessica Sirkin commented on Oracle's certification requirements, which now state that DBAs must take tests within one year in order to revamp their accreditations. The exams vet a person's familiarity with more recent versions of Oracle Database. Those with certifications in 7.3, 8, 8i and 9i must undergo tests to obtain accreditations in 10g, 11g or 12c to retain their active status within Oracle's CertView portal system. 

These rules apply to any professional holding a Certified Associate, Professional, Expert or Master credential in the aforementioned solutions. Those already possessing accreditation in 12c or 11g will be able to retain their active statuses for the foreseeable future, but DBAs with 10g certifications will be removed from the CertView list on March 1 of next year. 

As for the company's reasons, Oracle publicly stated that the measures are intended to "have qualified people implementing, maintaining and troubleshooting our software." 

SAP gets closer to Oracle 
Those who use Oracle's databases to power their SAP software are in luck. Earlier this year, SAP certified its solutions to coincide with the rollout of an updated version of Oracle 12c, specifically build 12.1.0.2. One of the reasons SAP is supporting Oracle's flagship database is that the company wants to provide its customers with more flexible upgrade plans from 11g Release 2 to Oracle's latest release. SAP's supporting features will include the following:

  • A multitenancy option for 12c, which allows multiple pluggable databases to be consolidated and managed within a single container database.
  • Hybrid columnar compression technology, which will certainly help those who are trying to engineer back-end databases to store more information.

Most importantly, the news source acknowledged that many businesses use Oracle's database products in conjunction with SAP's enterprise software. Incompatibility between the two has been a persistent headache for IT departments working with these solutions, but SAP's move will improve the matter.

Overall, hiring a team of database experts experienced in working with different software running on top of Oracle is a safe bet for organizations wary of this change. 

PostgreSQL isn’t just another database engine

Thu, 2014-12-18 06:13

While Oracle's database engine and Microsoft's SQL Server are among the top three server solutions used by enterprises, by no means is PostgreSQL being left in the dust.

This year, The PostgreSQL Global Development Group released PostgreSQL 9.4, which was equipped with several bug fixes as well as a few new capabilities, such as:

  • An ALTER SYSTEM command can be used to change configuration file entries
  • Materialized views can now be refreshed without blocking concurrent reads (see the sketch after this list)
  • Logical decoding for WAL data was added
  • Background worker processes can now be initiated, logged and terminated dynamically
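
Here's a brief sketch of the first two capabilities from a client session, using the psycopg2 driver; the connection string, parameter value and materialized view name are placeholders.

```python
import psycopg2

# Connection details and object names below are placeholders.
conn = psycopg2.connect("dbname=appdb user=dba")
conn.autocommit = True  # ALTER SYSTEM cannot run inside a transaction block
cur = conn.cursor()

# New in 9.4: change a postgresql.conf entry from SQL
cur.execute("ALTER SYSTEM SET work_mem = '64MB'")
cur.execute("SELECT pg_reload_conf()")

# New in 9.4: refresh a materialized view without blocking concurrent reads
# (requires a unique index on the view)
cur.execute("REFRESH MATERIALIZED VIEW CONCURRENTLY daily_sales_summary")

cur.close()
conn.close()
```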

PostgreSQL 9.4 is currently available for download on the developer's website. Users can access versions that are compatible with specific operating systems, including SUSE, Ubuntu, Solaris, Windows and Mac OS X. 

Rising to fame? 
ZDNet contributor Toby Wolpe spoke with EnterpriseDB chief architect Dave Page, who is also a member of PostgreSQL's core team, on the open source solution's popularity. He maintained that PostgreSQL's capabilities are catching up to Oracle's, an assertion that may not be shared by everyone but one worth acknowledging nonetheless. 

Page referred to PostgreSQL as "one of those well kept secrets that people just haven't cottoned on to." One of the reasons PostgreSQL has gained so much traction lately is MySQL's purchase by Sun, which was later acquired by Oracle. According to Page, people are apprehensive regarding Oracle's control of MySQL, another relational database engine.

Features galore? 
Throughout his interview with Wolpe, Page noted a general sentiment among DBAs who are switching from MySQL to PostgreSQL because the latter solution is more "feature-rich." It makes sense that PostgreSQL would have plenty of functions to offer DBAs due to its open source format. When users customize aspects of PostgreSQL that complement not only their own workflow but that of other DBAs as well, such capabilities tend to be integrated into the next release permanently.

One particular feature that has managed to stick in PostgreSQL is foreign data wrappers. Wolpe noted that data wrappers enable remote information to be categorized as a table within PostgreSQL, meaning queries can be run across both PostgreSQL tables and foreign data as if it were native. 

Another feature adds support for the JSONB data type, allowing JSON documents to be stored within PostgreSQL in a binary format. The advantage of this format is that JSONB columns can be indexed, giving queries against that data fast, index-backed operators.
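
Here's a short sketch of the JSONB idea, again with psycopg2 and hypothetical table and index names: store a JSON document in a jsonb column, index it with GIN, and query it with the containment operator.

```python
import json
import psycopg2

conn = psycopg2.connect("dbname=appdb user=dba")  # placeholder connection details
cur = conn.cursor()

cur.execute("CREATE TABLE IF NOT EXISTS events (id serial PRIMARY KEY, payload jsonb)")
# A GIN index is what makes containment queries on a jsonb column fast
cur.execute("CREATE INDEX events_payload_idx ON events USING gin (payload)")

cur.execute("INSERT INTO events (payload) VALUES (%s::jsonb)",
            [json.dumps({"type": "login", "user": "jdoe", "ok": True})])

# The @> containment operator asks: does payload contain this JSON fragment?
cur.execute("SELECT id FROM events WHERE payload @> %s::jsonb",
            [json.dumps({"type": "login"})])
print(cur.fetchall())

conn.commit()
cur.close()
conn.close()
```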

While PostgreSQL may not be the engine of choice for some DBAs, it is a solution worth acknowledging. 

Yes, SMBs should pay attention to disaster recovery

Thu, 2014-12-18 06:12

Effective disaster recovery plans admittedly take a lot of time, resources and attention to develop, which may cause some small and mid-sized businesses to shy away from the practice. While it's easy to think "it could never happen to me," that's certainly not a good mindset to possess.

While the sole proprietor of a small graphic design operation may want to set up a disaster recovery plan, he or she may not know where to start. It's possible that the application used to create designs resides in-house, but the system used to deliver content to customers may be hosted through the cloud. It's a confusing situation, especially if one doesn't have experience in IT.

SMEs need DR, but don't have robust plans 
To understand how strong small and mid-sized enterprises' DR strategies are, Dimensional Research and Axcient conducted a survey of 453 IT professionals working at companies with between 50 and 1,000 employees. The study found that 71 percent of respondents back up both information and software, but only 24 percent back up all their data and applications. Other notable discoveries are listed below:

  • A mere 7 percent of survey participants felt "very confident" that they could reboot operations within two hours of an incident occurring. 
  • More than half (53 percent) of respondents asserted company revenues would be lost until critical systems could be rebooted.  
  • Exactly 61 percent of SMBs use multiple backup and recovery tools that perform the same functions.
  • Almost three-fourths maintain that using multiple DR assets can increase the risk of endeavors failing. 
  • Eighty-nine percent surveyed view cloud-based DR strategies as incredibly desirable. The same percentage acknowledged that business workers are much less productive during outages. 

What measures can SMBs take? 
For many IT departments at SMEs, taking advantage of cloud-based DR plans can be incredibly advantageous. IT Business Edge's Kim Mays noted that decision-makers should pay close attention to the information and applications employees access most often to perform day-to-day tasks. Transitioning these IT assets to cloud infrastructures in the event of a disaster will allow workers to continue with their responsibilities.

Of course, using a cloud-based strategy isn't the be-all, end-all to a solid DR blueprint. For instance, it's possible that personnel residing in affected areas may not have Internet access. This is where a business' senior management comes into play: Set guidelines that will allow staff to make decisions that will benefit the company during an outage. 

Linux users may need experts to reinforce malware detection functions

Thu, 2014-12-18 04:57

Enterprises using Linux operating systems to run servers or desktops may want to consider hiring specialists to prevent actions initiated by the "less" command. 

In addition, Linux users should also be aware that they have been targeted by a dangerous cyberespionage operation that is believed to be headquartered in Russia. If these two threats go unacknowledged, enterprises that use Linux may sustain grievous data breaches. 

A bug in the "less" command 
The vulnerability concerning less was detailed by Lucian Constantin, a contributor to Computerworld. Constantin noted that less presents itself as a "harmless" instruction that enables users to view the contents of files downloaded from the Web. However, using the less directive could also allow perpetrators to execute code remotely. 

Less is typically used to view information without having to load files into a computer's memory, a huge help for those simply browsing documents on the Internet. However, lesspipe is a script that automatically accesses third-party tools to process files with miscellaneous extensions such as .pdf, .gz, .xpi, and so on. 

One such tool, the cpio file archiver, could enable a cybercriminal to initiate an arbitrary code execution exploit. Essentially, this would give him or her control over a machine, enabling them to manipulate it at will. This particular bug was discovered by Michal Zalewski, a Google security engineer.

"While it's a single bug in cpio, I have no doubt that many of the other lesspipe programs are equally problematic or worse," said Zalewski, as quoted by Constantin. 

Taking aim and firing 
The less command isn't the only thing Linux users should be concerned with. In a separate piece for PCWorld, Constantin noted that Russian cyberespionage group Epic Turla has directed its attention toward infiltrating machines running Linux.

Kaspersky Lab asserted Epic Turla is taking advantage of cd00r, an open-source backdoor program that was created in 2000. This particular tool enables users to initiate arbitrary directives, as well as "listen" to commands received via Transmission Control Protocol (TCP) or User Datagram Protocol (UDP) – functionality that makes it a dangerous espionage asset.

"It can't be discovered via netstat, a commonly used administrative tool," said Kaspersky researchers, as quoted by Constantin. "We suspect that this component was running for years at a victim site, but do not have concrete data to support that statement just yet." 

If Linux users want to secure their systems, consulting with specialists certified in the OS may not be a bad idea. 

Data management challenges, concerns for health care companies

Thu, 2014-12-18 04:56

Volume and velocity are two words analysts are associating with health care data, motivating CIOs to assess the scalability and security of their current database infrastructures. 

Protecting the sensitive information contained within electronic health records has always been a concern, but the greatest issue at hand is that some health care providers don't have the personnel, assets or time required to effectively manage and defend their databases. These concerns may incite mass adoption of outsourced database administration services.

Greater volume at a faster rate 
CIO.com's Kenneth Corbin referenced a report conducted by EMC and research firm IDC, which discovered that the amount of health information is expected to increase 48 percent on an annual basis for the foreseeable future. In 2013, 153 exabytes of health care data existed. By 2020, that figure is anticipated to expand to 2,314 exabytes. 

EMC and IDC analysts proposed a scenario in which all of that information was stored on a stack of tablets. Referencing the 2020 statistic, they asserted that stack would be more than 82,000 miles high, reaching a third of the way to the moon. IDC Health Insights Research Vice President Lynne Dunbrack maintained that health care companies can prepare for this explosion of data by identifying who owns the information and classifying it.

"Understanding what the data means is key to making data governance and interoperability work, and is essential for analytics, big data initiatives and quality reporting initiatives, among other things," wrote Dunbrack in an email, as quoted by Corbin. 

More data means greater security concerns
As hospitals, insurance providers, clinics and other such organizations implement EHR software and increase their data storage capacities, hackers can be expected to place the health care industry at the top of their list of targets. Health care records contain a plethora of valuable data, from Social Security numbers to checking account information.

Health IT Security cited the problems Aventura Hospital and Medical Center in South Florida has encountered. Over the past two years, the institution has sustained three data breaches, one of which was caused by a vendor's employee who stole information on an estimated 82,000 patients. Worst of all, the worker was an employee of Valesco Ventures, Aventura's Health Insurance Portability and Accountability Act business associate.

With this particular instance in mind, finding a database administration service with trustworthy employees is essential. In addition, contracting a company that can provide remote database monitoring 24/7/365 is a must – there can be no compromises. 

Notable updates of SUSE Linux Enterprise 12

Fri, 2014-12-12 13:03

Transcript

Hi, welcome to RDX! Using SUSE Linux Enterprise Server to manage your workstations, servers and mainframes? SUSE recently released a few updates to the solution, dubbed Linux Enterprise Server 12, that professionals should take note of.

For one thing, SUSE addressed the problem with the GNU Bourne-Again Shell (Bash), also known as the “Shellshock” bug. This is a key fix, as it prevents hackers from placing malicious code onto servers through remote computers.

As far as disaster recovery capabilities are concerned, Linux Enterprise Server 12 is equipped with snapshot and full-system rollback features. These two functions enable users to revert to the original configuration of a system if it happens to fail.

Want a team of professionals that can help you capitalize on these updates? Look no further than RDX’s Linux team – thanks for watching!

Preparing for the end of Windows Server 2003

Mon, 2014-12-01 07:24

Although support for Windows Server 2003 doesn't end until July of next year, enterprises that have used the operating system since its inception are transitioning to the solution's latest iteration, Windows Server 2012 R2.

Preliminary considerations
Before diving into the implications of transitioning from Server 2003 to Server 2012 R2, it's important to answer a valid question: Why not simply make the switch to Windows Server 2008 R2?

It's a conundrum that Windows IT Pro contributor Orin Thomas has ruminated on since the announcement of Microsoft's discontinuation of Server 2003. While he acknowledged various reasons why some professionals are hesitant to make the leap from Server 2003 to Server 2012 R2 (such as application compatibility issues and the "Windows 8-style interface"), he pointed to a key concern: time.

Basically, Server 2008 R2 will cease to receive updates and support on Jan. 14, 2020. Comparatively, Server 2012 R2's end of life is slated for Jan. 10, 2023.

In the event organizations have difficulty making the transition, there's always the option of seeking assistance from experts with certifications in Server 2012 R2. On top of migration and integration, these professionals can provide continued support throughout the duration of the solution's usage.

Key considerations
As companies using Windows Server 2003 will be moving to either Server 2008 R2 or Server 2012 R2, a number of implications must be taken into account. ZDNet contributor Ken Hess outlined several recommendations for those preparing for the migration:

  1. Identify how many Server 2003 systems you have in place.
  2. Aggregate and organize the hardware specifications for each system (CPU, memory, disk space, etc.).
  3. Assess how heavily these solutions were utilized over the years, then correlate them with projected growth and future workloads.
  4. Do away with systems that are no longer applicable to operations.
  5. Determine which applications running on top of Server 2003 are critical to the business model.
  6. Deduce how virtual machines can be leveraged to host underutilized processes.
  7. Collaborate with a database administration firm to outline and implement a migration plan (provide the partner with the data mentioned above).

These are just a few starting points on which to base a comprehensive migration plan. Also, it's important to be aware of unexpected spikes in server utilization. Although upsurges of 100 percent may occur infrequently, it's important that systems will be able to handle them effectively. As always, be sure to troubleshoot the renewed solution after implementation.

Database active monitoring a strong defense against SQL injections

Mon, 2014-12-01 07:24

SQL injections have been named as the culprits of many database security woes, including the infamous Target breach that occurred at the commencement of last year's holiday season.

Content management system compromised
One particular solution was recently flagged as vulnerable to such hacking techniques. Chris Duckett, a contributor to ZDNet, referenced a public service announcement released by Drupal, an open source content management solution used to power millions of websites and applications.

The developer noted that, unless users patched their sites against SQL injection attacks before October 15, "you should proceed under the assumption that every Drupal 7 website was compromised." Drupal expanded by asserting that updating to 7.32 will patch the vulnerability, but websites that have already been exposed are still compromised – the reason being that hackers have already obtained back-end information.

There is one way in which websites that sustained attacks could have remained protected. Database monitoring, regardless of the system being used, can alert administrators of problems as they arise, giving them ample time to respond to breaches.

Why database monitoring works
Although access permissions, anti-malware tools and other assets are designed to dismantle and eradicate intrusions, some of their detection features leave something to be desired. Therefore, in order for programs capable of deterring SQL injections to operate to the best of their ability, they must be programmed to work in conjunction with surveillance tools that assess all database actions constantly.
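
As a toy example of the signature side of that surveillance, the Python sketch below flags captured SQL statements that match a few classic injection patterns. The patterns and captured statements are illustrative only; real monitoring products pair signatures with behavioral baselines, which is exactly the point being made here.

```python
import re

# A few classic injection signatures; real monitoring products combine
# signatures with behavioral baselines of normal database activity.
SUSPICIOUS = [
    re.compile(r"\bunion\b\s+\bselect\b", re.IGNORECASE),
    re.compile(r"\bor\b\s+1\s*=\s*1", re.IGNORECASE),
    re.compile(r"--|/\*"),            # inline comment tricks
    re.compile(r";\s*drop\s+table", re.IGNORECASE),
]

def flag_statement(sql: str) -> bool:
    """Return True if the captured statement matches a known injection pattern."""
    return any(p.search(sql) for p in SUSPICIOUS)

captured = [
    "SELECT name FROM users WHERE id = 42",
    "SELECT name FROM users WHERE id = 42 OR 1=1 --",
]
for stmt in captured:
    if flag_statement(stmt):
        print("ALERT:", stmt)
```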

The Ponemon Institute polled 595 database experts on the matter, asking them about the effectiveness of server monitoring tools. While Chairman Larry Ponemon acknowledged the importance of using continuous monitoring to look for anomalous behavior, Secure Ideas CEO Kevin Johnson said some tools can miss SQL injections because the attacks are designed to appear legitimate. Therefore, it's important for surveillance programs to also be directed toward identifying vulnerabilities. Paul Henry, senior instructor at the SANS Institute, also weighed in on the matter.

"I believe in a layered approach that perhaps should include a database firewall to mitigate the risk of SQL injection, combined with continuous monitoring of the database along with continuous monitoring of normalized network traffic flows," said Henry, as quoted by the source.

At the end of the day, having a team of professionals on standby to address SQL injections if and when they occur is the only way to guarantee that these attacks don't escalate into massive consequences.

Securing Sensitive Database Data Stores

Mon, 2014-11-24 10:18

Introduction

Database administrators, since the inception of their job descriptions, have been responsible for the protection of their organization’s most sensitive database assets. They are tasked with ensuring that key data stores are safeguarded against any type of unauthorized data access.

Since I’ve been a database tech for 25 years now, this series of articles will focus on the database system and some of the actions we can take to secure database data. We won’t be spending time on the multitude of perimeter protections that security teams are required to focus on. Once those mechanisms are breached, the last line of defense for the database environments will be the protections the database administrator has put in place.

You will notice that I will often refer to the McAfee database security protection product set when I describe some of the activities that will need to be performed to protect your environments. If you are truly serious about protecting your database data, you’ll quickly find that partnering with a security vendor is an absolute requirement and not “something nice to have.”

I could go into an in-depth discussion on RDX’s vendor evaluation criteria, but the focus of this series of articles will be on database protection, not product selection. After an extensive database security product analysis, we felt that the breadth and depth of McAfee’s database security offering provided RDX with the most complete solution available.

This is serious business, and you are up against some extremely proficient opponents. To put it lightly, “they are one scary bunch.” Hackers can be classified as intelligent, inquisitive, patient, thorough, driven and more often than not, successful. This combination of traits makes database data protection a formidable challenge. If they target your systems, you will need every tool at your disposal to prevent their unwarranted intrusions.

Upcoming articles will focus on the following key processes involved in the protection of sensitive database data stores:

Evaluating the Most Common Threats and Vulnerabilities

In the first article of this series, I’ll provide a high level overview of the most common threat vectors. Some of the threats we will be discussing will include unpatched database software vulnerabilities, unsecured database backups, SQL Injection, data leaks and a lack of segregation of duties. The spectrum of tactics used by hackers could result in an entire series of articles dedicated to database threats. The scope of these articles is on database protection activities and not a detailed threat vector analysis.

Identifying Sensitive Data Stored in Your Environment

You can’t protect what you don’t know about. The larger your environment, the more susceptible you will be to data being stored that hasn’t been identified as being sensitive to your organization. In this article, I’ll focus on how RDX uses McAfee’s vulnerability scanning software to identify databases that contain sensitive data such as credit card or Social Security numbers stored in clear text. The remainder of the article will focus on identifying other objects that may contain sensitive, and unprotected data, such as test systems cloned from production, database backups, load input files, report output, etc…

Initial and Ongoing Vulnerability Analysis

Determining how the databases are currently configured from a security perspective is the next step to be performed. Their release and patch levels will be identified and compared to vendor security patch distributions. How closely support teams adhere to industry and internal security best practices is also evaluated at this stage. The types of vulnerabilities will span the spectrum, from weak and default passwords to unpatched (and often well known) database software weaknesses.

Ranking the vulnerabilities allows the highest priority issues to be addressed more quickly than their less important counterparts. After the vulnerabilities are addressed, the configuration is used as a template for future database implementations. Subsequent scans, run on a scheduled basis, will ensure that no new security vulnerabilities are introduced into the environment.

Database Data Breach Monitoring

Most traditional database auditing mechanisms are designed to report data access activities after they have occurred. There is no alerting mechanism. Auditing is activated, the data is collected and reports are generated that allow the various activities performed in the database to be analyzed for the collected time period.

Identifying a data breach after the fact is not database protection. It is database reporting. To protect the databases we are tasked with safeguarding, we need a solution that has the ability to alert, or alert and stop, the unwarranted data accesses from occurring.

RDX found that McAfee’s Database Activity Monitoring product provides the real time protection we were looking for. McAfee’s product has the ability to identify, terminate and quarantine a user that violates a predefined set of database security policies.

To be effective, database breach protection must be configured as a stand-alone, and separated, architecture. Otherwise, internal support personnel could deactivate the breach protection service by mistake or deliberate intention. This separation of duties is an absolute requirement for most industry compliance regulations such as HIPAA, PCI DSS and SOX. The database must be protected from both internal and external threat vectors.

In an upcoming article of this series, we’ll learn more about real-time database activity monitoring and the benefits it provides to organizations that require a very high level of protection for their database data stores.

Ongoing Database Security Strategies

Once the database vulnerabilities have been identified and addressed, the challenge is to ensure that the internal support team’s future administrative activities do not introduce any additional security vulnerabilities into the environment.

In this article, I’ll provide recommendations on a set of robust, documented security controls and best practices that will assist you in your quest to safeguard your database data stores.

A documented plan to quickly address new database software vulnerabilities is essential to protecting your databases. The hacker’s “golden window of zero day opportunity” exists from when the software’s weakness is identified until the security patch that addresses it is applied.

Separation of duties must also be considered. Are the same support teams that are responsible for your vulnerability scans, auditing and administering your database breach protection systems also accessing your sensitive database data stores?

Reliable controls will need to be implemented, including support role separation and the generation of audit records, to ensure proper segregation of duties so that even privileged users cannot bypass security.

Wrap-up

Significant data breach announcements are publicized on a seemingly daily basis. External hackers and rogue employees continuously search for new ways to steal sensitive information. There is one component that is common to many thefts – the database data store. You need a plan to safeguard them. If not, your organization may be the next one that is highlighted on the evening news.

Visualization shows hackers behind majority of data breaches

Wed, 2014-11-19 10:18

Transcript

Hi, welcome to RDX! Amid constant news of data breaches, ever wonder what's causing all of them? IBM and Ponemon's Global Breach Analysis can give you a rundown. 

While some could blame employee mishaps or poor security, hacking is the number one cause of many data breaches, most of which are massive in scale. For example, when Adobe was hacked, approximately 152 million records were compromised.

As you can imagine, databases were prime targets. When eBay lost 145 million records to perpetrators earlier this year, hackers used the login credentials of just a few employees and then targeted databases holding user information.

To prevent such trespasses from occurring, organizations should employ active database monitoring solutions that scrutinize login credentials to ensure the appropriate personnel gain entry.

Thanks for watching! Visit us next time for more news and tips about database protection!

Selecting a Data Warehouse Appliance [VIDEO]

Mon, 2014-11-10 11:57

Transcript

Hi, welcome to RDX! Selecting a data warehouse appliance is a very important decision to make. The amount of data that companies store is continuously increasing, and DBAs now have many data storage technologies available to them. Uninformed decisions may cause a number of problems including limited functionality, poor performance, lack of scalability, and complex administration.

Oracle, Microsoft, and IBM understand the common data warehousing challenges DBAs face and offer data warehouse appliances that help simplify administration and help DBAs effectively manage large amounts of data.

Need help determining which data warehouse technology is best for your business? Be sure to check out RDX VP of Technology Chris Foot’s recent blog post, Data Warehouse Appliance Offerings, where he provides more details about each vendor’s architecture and the benefits of each.

Thanks for watching. See you next time!
 

Oracle assists professionals looking for recovery programs

Mon, 2014-11-10 01:40

Between natural disasters, cybercrime and basic human error, organizations are looking for tools that support disaster recovery endeavors, as well as the professionals capable of using them.

A fair number of administrators often use Oracle's advanced recovery features to help them ensure business continuity for database-driven applications. The database vendor recently unveiled a couple of new offerings that tightly integrate with their Oracle Database architecture.

Restoration and recovery
IT-Online acknowledged Oracle's Zero Data Loss Recovery Appliance, which is the first of its kind in regard to its ability to ensure critical Oracle databases retain their information even if the worst should occur. The source maintained that Oracle's new architecture can protect thousands of databases using a cloud-based, centralized recovery appliance as the target.

In other words, the Recovery Appliance isn't simply built to treat databases as information repositories that need to be backed up every so often.  The appliance's architecture replicates changes in real time to ensure that the recovery databases are constantly in sync with their production counterparts.  Listed below are several features that make the architecture stand out among conventional recovery offerings:

  • Live "Redo" data is continuously transported from the databases to the cloud-based appliance, protecting the most recent transactions so that servers don't sustain data loss in the event of a catastrophic failure
  • To reduce the impact on the production environment, the Recovery Appliance architecture only delivers data that has been changed, which reduces server loads and network impact
  • The appliance's automatic archiving feature allows backups to be automatically stored on low cost tape storage
  • Data stored on the appliance can be used to recreate a historical version of the database

Simplifying database management and availability 
The second application, which Oracle hopes will enhance entire database infrastructures, is Oracle Database Appliance Management. The appliance manager application allows administrators to create rapid snapshots of both databases and virtual machines, enabling them to quickly create and allocate development and test environments.

"With this update of Oracle Database Appliance software, customers can now reap the benefits of Oracle Database 12c, the latest release of the world's most popular database right out of the box," said Oracle Vice President of Product Strategy and Business Development Sohan DeMel. "We added support for rapid and space-efficient snapshots for creating test and development environments, organizations can further capitalize on the simplicity of Oracle engineered systems with speed and efficiency." 

Dropbox Database Infiltrated by Hackers [VIDEO]

Fri, 2014-11-07 12:57

Transcript

While providing database security services to cloud storage providers is possible, many such companies aren't taking the necessary precautions to ensure customer data remains protected. 

According to 9 to 5 Mac, Dropbox recently announced that a database holding 7 million logins for its users was infiltrated. The environment was operated by a third party, which was hired by Dropbox to store its customer data. To the company's relief, many of the stolen passwords were outdated. 

While Dropbox is taking steps to mitigate the situation, the enterprise advised its customers to change their login information as an extra precaution. 

The best way to prevent breaches from occurring is to install automated intrusion detection software. In addition, regularly auditing existing systems for vulnerabilities is considered a best practice. 

Thanks for watching! 

RDX Services: Optimization [VIDEO]

Mon, 2014-11-03 15:57

Transcript

Hi, welcome to RDX. When searching for a database administration service, it's important to look for a company that prioritizes performance, security and availability.

How does RDX deliver such a service? First, we assess all vulnerabilities and drawbacks that are preventing your environments from operating efficiently. Second, we make any applicable changes that will ensure your business software is running optimally. From there, we regularly conduct quality assurance audits to prevent any performance discrepancies from arising. 

In addition, we offer 24/7 support for every day of the year. We recognize that systems need to remain online on a continuous basis, and we're committed to making sure they remain accessible. 

Thanks for watching!

Data Warehouse Appliance Offerings

Fri, 2014-10-31 11:15

Introduction

Information Technology units will continue to be challenged by the unbridled growth of their organization’s data stores. An ever-increasing amount of data needs to be extracted, cleansed, analyzed and presented to the end user community. Data volumes that were unheard of a year ago are now commonplace. Day-to-day operational systems are now storing such large amounts of data that they rival data warehouses in disk storage and administrative complexity. New trends, products, and strategies, guaranteed by vendors and industry pundits to solve large data store challenges, are unveiled on a seemingly endless basis.

Choosing the Large Data Store Ecosystem

Choosing the correct large data store ecosystem (server, storage architecture, OS, database) is critical to the success of any application that is required to store and process large volumes of data. This decision was simple when the number of alternatives available was limited. With the seemingly endless array of architectures available, that choice is no longer as clear cut. Database administrators now have more choices available to them than ever before. In order to correctly design and implement the most appropriate architecture for their organization, DBAs must evaluate and compare large data store ecosystems and not the individual products.

Traditional Large Data Store Technologies

Before we begin our discussion on the various advanced vendor offerings, we need to review the database features that are the foundation of the customized architectures we will be discussing later in this article. It is important to note that although each vendor offering certainly leverages the latest technologies available, the traditional data storage and processing features that DBAs have been utilizing for years remain critical components of the newer architectures.

Partitioning

Partitioning data into smaller disk storage subsets allows the data to be viewed as a single entity while overcoming many of the challenges associated with the management of large data objects stored on disk.

Major database vendor products offer optimizers that are partition aware and will create query plans that access only those partitioned objects needed to satisfy the query’s data request (partition pruning). This feature allows administrators to create large data stores and still provide fast access to the data.

Partitioning allows applications to take advantage of “rolling window” data operations. Rolling windows allow administrators to roll off what is no longer needed. For example, a DBA may roll off the data in the data store containing last July’s data as they add this year’s data for July. If the data is ever needed again, administrators are able to pull the data from auxiliary or offline storage devices and plug the data back into the database.
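
Here’s a small Python sketch of the rolling-window bookkeeping: given today’s date and a retention period, work out which monthly partition to add and which one to archive and roll off. The partition naming convention is hypothetical, and the actual ADD/DETACH syntax varies by database vendor.

```python
from datetime import date

RETENTION_MONTHS = 12  # keep a one-year rolling window

def month_partition(d: date) -> str:
    """Hypothetical naming convention: sales_p_YYYYMM."""
    return f"sales_p_{d:%Y%m}"

def rolling_window_actions(today: date):
    """Return the partition to add and the partition to roll off."""
    new = month_partition(today)
    old_month = today.month - RETENTION_MONTHS
    old_year = today.year + (old_month - 1) // 12
    old_month = (old_month - 1) % 12 + 1
    to_archive = month_partition(date(old_year, old_month, 1))
    return new, to_archive

add, drop = rolling_window_actions(date(2014, 7, 1))
print(f"ADD PARTITION {add}; ARCHIVE AND DETACH {drop}")
# -> ADD PARTITION sales_p_201407; ARCHIVE AND DETACH sales_p_201307
```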

Query Parallelism

Query parallelism improves data access performance by splitting work among multiple CPUs. Most database optimizers are also parallel aware and are able to break up a single query into sub queries that access the data simultaneously.

Without parallelism, a SQL statement’s work is performed by a single process. Parallel processing allows multiple processes to work together to simultaneously process a single SQL statement or utility execution. By dividing the work necessary to process a statement among multiple CPUs, the database can execute the statement more quickly than if the work was single-threaded.

The parallel query option can dramatically improve performance for data-intensive operations associated with decision support applications or very large database environments. Symmetric multiprocessing (SMP), clustered, or massively parallel systems gain the largest performance benefits from the parallel query option because query processing can be effectively split up among many CPUs on a single system.
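
As a loose illustration of the divide-the-work idea, the sketch below splits a scan across worker processes with Python’s multiprocessing pool and then combines the partial results, much as a parallel-aware optimizer splits a single statement into sub-queries and merges their output. It is an analogy only, not a depiction of any vendor’s parallel engine.

```python
from multiprocessing import Pool

def scan_chunk(rows):
    """Each worker scans its own slice, the way a parallel plan splits a table scan."""
    return sum(amount for amount in rows if amount > 100)

if __name__ == "__main__":
    table = list(range(1_000_000))          # stand-in for a large fact table
    workers = 4
    chunk = len(table) // workers
    chunks = [table[i * chunk:(i + 1) * chunk] for i in range(workers)]

    with Pool(workers) as pool:
        partials = pool.map(scan_chunk, chunks)   # sub-queries run simultaneously

    print("parallel total:", sum(partials))       # coordinator combines partial results
```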

Advanced Hardware and Software Technologies

Let’s continue our discussion by taking a high-level look at the advanced data warehouse offerings from the three major database competitors, Oracle, Microsoft and IBM. Each of the vendors’ offerings are proprietary data warehouse ecosystems, often called appliances, that consist of hardware, OS and database components. We’ll complete our review by learning more about Hadoop’s features and benefits.

Oracle Exadata

Oracle’s Exadata Database Machine combines the Oracle database with intelligent data storage servers to deliver very high performance for large data store applications. Exadata is a purpose-built warehouse ecosystem consisting of hardware, operating system and database components.

Oracle Exadata Storage Servers leverage high speed interconnects, data compression and intelligent filtering and caching features to increase data transfer performance between the database server and intelligent storage servers. In addition, the Exadata architecture is able to offload data intensive SQL statements to the storage servers to filter the results before the relevant data is returned to the database server for final processing.

Exadata uses PCI flash technology rather than flash disks. Oracle places the flash memory directly on the high-speed PCI bus rather than behind slower disk controllers and directors. Each Exadata Storage Server includes 4 PCI flash cards that combine for a total of 3.2 TB of flash memory. Although the PCI flash can be utilized as traditional flash disk storage, it provides better performance when configured as a flash cache that sits in front of the disks. Exadata’s Smart Flash Cache automatically caches frequently accessed data in the PCI cache, much like its traditional database buffer cache counterpart, while less frequently accessed data remains on disk. Data sent to the PCI flash cache is also compressed to increase effective storage capacity.

Exadata also offers an advanced compression feature called Hybrid Columnar Compression (HCC) to reduce storage requirements for large databases. Exadata offloads the compression/decompression workload to the processors contained in the Exadata storage servers.
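As a rough sketch of how HCC is requested (the table name is illustrative, and the feature requires Exadata-class storage), a warehouse-oriented compression level can be declared at table creation time:

    -- Create a compressed copy of the hypothetical fact table.
    -- QUERY HIGH favors warehouse scans; ARCHIVE levels trade more CPU
    -- for still-smaller storage.
    CREATE TABLE sales_history
        COMPRESS FOR QUERY HIGH
        AS SELECT * FROM sales_fact;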

These technologies enable Exadata to deliver high performance for large data stores accessed by both decision support and online operational systems. The Exadata machine runs an Oracle database, which allows Oracle-based applications to be easily migrated. Oracle describes the Exadata architecture as “scale out” capable, meaning multiple Exadata servers can be lashed together to increase computing and data access horsepower. Oracle RAC, as well as Oracle’s Automatic Storage Management (ASM), can be leveraged to dynamically add more processing power and disk storage.

Microsoft SQL Server PDW

SQL Server Parallel Data Warehouse (PDW) is a massively parallel processing (MPP) data warehousing appliance designed to support very large data stores. Like Oracle’s Exadata implementation, the PDW appliance’s components consist of the entire database ecosystem including hardware, operating system and database.

MPP databases use a “shared-nothing” architecture in which multiple physical servers (nodes) each run an instance of the database with their own dedicated CPU, memory and storage.

Microsoft PDW’s architecture consists of:

  • The MPP Engine
    • Responsible for generating parallel query execution plans and coordinating the workloads across the system’s compute nodes
    • Uses a SQL Server database to store metadata and configuration data for all of the databases in the architecture
    • In essence, it acts as the traffic cop and the “brain” of the PDW system
  • Compute Nodes
    • Each compute node also runs an instance of the SQL Server database
    • The compute nodes’ databases are responsible for managing the user data

As T-SQL is executed in the PDW system, queries are broken up to run simultaneously over multiple physical nodes, using parallel execution to provide high-performance data access. The key to success with PDW is selecting appropriate distribution columns that are used to intelligently distribute the data amongst the nodes. The ideal distribution column is one that is accessed frequently, evenly distributes data based on the column’s values and has low volatility (doesn’t change often).
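A minimal sketch of the T-SQL involved is shown below; the schema, table and column names are hypothetical, and the columnstore clause is simply one common pairing.

    -- Hash-distribute the large fact table on a frequently joined,
    -- evenly distributed, low-volatility column.
    CREATE TABLE dbo.FactSales
    (
        CustomerKey  INT   NOT NULL,
        OrderDateKey INT   NOT NULL,
        SalesAmount  MONEY NOT NULL
    )
    WITH (DISTRIBUTION = HASH(CustomerKey), CLUSTERED COLUMNSTORE INDEX);

    -- Small dimension tables are typically replicated to every compute node.
    CREATE TABLE dbo.DimDate
    (
        OrderDateKey INT      NOT NULL,
        CalendarYear SMALLINT NOT NULL
    )
    WITH (DISTRIBUTION = REPLICATE);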

Microsoft Analytics Platform System (APS) - Hadoop and SQL Server Integration

Microsoft’s Analytics Platform System (APS) combines its massively parallel processing offering (PDW) with HDInsight, its version of Apache Hadoop. Microsoft has partnered with Hortonworks, a commercial Hadoop software vendor that provides a Windows-based, 100% Apache Hadoop distribution. Please see the Apache Hadoop section below for more detailed information on the Hadoop engine.

Integrating a Hadoop engine into SQL Server allows Microsoft to capture, store, process and present both structured (relational) and unstructured (non-relational) data within the same logical framework. Organizations wanting to process unstructured data have often turned to Apache Hadoop environments, which required them to learn new data storage technologies, languages and an entirely new processing architecture.

Microsoft’s PolyBase provides APS users with the ability to query both structured and non-structured data with a single T-SQL based query. APS application programmers are not required to learn MapReduce or HiveQL to access data stored in the APS platform, so organizations using APS avoid the additional costs associated with re-training their existing staff or hiring personnel with experience in Hadoop access methods.
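A hedged sketch of the PolyBase objects involved follows; the Hadoop URI, file layout and every object name are assumptions for illustration, and the exact syntax varies across APS and SQL Server releases. The final query reuses the hypothetical FactSales table from the earlier PDW sketch.

    -- Point at the Hadoop cluster and describe the file layout.
    CREATE EXTERNAL DATA SOURCE HadoopCluster
    WITH (TYPE = HADOOP, LOCATION = 'hdfs://hadoop-head-node:8020');

    CREATE EXTERNAL FILE FORMAT PipeDelimited
    WITH (FORMAT_TYPE = DELIMITEDTEXT,
          FORMAT_OPTIONS (FIELD_TERMINATOR = '|'));

    -- Expose the Hadoop-resident files as a queryable external table.
    CREATE EXTERNAL TABLE dbo.WebClickStream
    (
        CustomerKey INT,
        ClickUrl    VARCHAR(2000)
    )
    WITH (LOCATION = '/data/clickstream/',
          DATA_SOURCE = HadoopCluster,
          FILE_FORMAT = PipeDelimited);

    -- One T-SQL statement joins relational and Hadoop-resident data.
    SELECT f.CustomerKey, SUM(f.SalesAmount) AS TotalSales, COUNT(c.ClickUrl) AS Clicks
    FROM   dbo.FactSales f
    JOIN   dbo.WebClickStream c ON c.CustomerKey = f.CustomerKey
    GROUP  BY f.CustomerKey;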

IBM PureData System for Analytics

Not to be outdone by its database rivals, IBM also provides a proprietary appliance, IBM PureData System for Analytics. The system, powered by Netezza technology, once again combines the hardware, database and storage into a single platform offering. PureData System for Analytics is an MPP system utilizing IBM blade servers and dedicated disk storage servers that, like its competitors, is able to intelligently distribute workloads amongst the processing nodes.
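On a Netezza-based system, data placement across the processing nodes is declared when a table is created. The following is a minimal sketch; the table, columns and distribution key are illustrative only.

    -- Netezza-style distribution clause: spread rows across nodes by a key
    -- column, or use RANDOM for round-robin placement.
    CREATE TABLE store_sales
    (
        customer_key INTEGER,
        sale_date    DATE,
        amount       NUMERIC(12,2)
    )
    DISTRIBUTE ON (customer_key);   -- alternatively: DISTRIBUTE ON RANDOM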

IBM leverages field-programmable gate arrays (FPGAs) in its FAST engines. IBM runs a FAST engine on each node to provide compression, data filtering and ACID compliance on the Netezza systems. The real benefit of FAST is that the FPGA technology allows the engines to be custom tailored to the instructions being sent to them for processing. The compiler divides the query plan into executable code segments, called snippets, which are sent in parallel to the Snippet Processors for execution. The FAST engine is able to customize its filtering according to the snippet being processed.

IBM’s Cognos, DataStage and InfoSphere BigInsights software products are included in the offering. IBM’s goal is to provide a total warehouse solution, from ETL to final data presentation, to PureData System for Analytics users.

In addition, IBM provides industry-specific warehouse offerings for the banking, healthcare, insurance, retail and telecommunications verticals. IBM’s “industry models” are designed to reduce the time and effort needed to design data warehousing systems for organizations in these business sectors. IBM provides data warehouse design and analysis templates to accelerate the build process, and IBM consulting assists customers in tailoring the architecture to their organization’s unique business needs.

Non-Database Vendor Technologies

New “disruptive” products that compete with the traditional database vendor offerings continue to capture the market’s attention. These products span the spectrum from NoSQL stores that provide easy access to unstructured data to entirely new architectures such as Apache Hadoop.

Major database vendors will make every effort to ensure that disruptive technologies gaining market traction become an enhancement, not a replacement, for their traditional database offerings. Microsoft’s APS platform is an excellent example of this approach.

Apache Hadoop

Apache Hadoop is a software framework that supports data-intensive distributed applications under a free license. Hadoop clusters commodity servers to offer scalable, affordable large-data storage and distributed processing in a single architecture.

A Hadoop cluster consists of a single master and multiple worker nodes. The master provides job control and scheduling services to the worker nodes. Worker nodes provide storage and computing services. The architecture is distributed, in that the nodes do not share memory or disk.

A distributed architecture allows computing horsepower and storage capacity to be added without disrupting on-going operations. Hadoop’s controlling programs keep track of the data located on the distributed servers. In addition, Hadoop provides multiple copies of the data to ensure data accessibility and fault tolerance.

Hadoop connects seamlessly to every major RDBMS through open-standard connectors providing developers and analysts with transparent access through tools they are familiar with. When used simply as a large-data storage location, it is accessible through a variety of standards-based methods such as FUSE or HTTP. Hadoop also offers an integrated stack of analytical tools, file system management and administration software to allow for native exploration and mining of data.
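As a small illustration of that familiar access, here is a hedged HiveQL sketch (Hive is one of the SQL-like tools commonly bundled with Hadoop distributions); the HDFS path, table and columns are assumptions for illustration.

    -- Expose raw HDFS files as an external table, then query them with
    -- familiar SQL-like syntax.
    CREATE EXTERNAL TABLE web_logs (
        ip        STRING,
        hit_time  STRING,
        url       STRING
    )
    ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t'
    LOCATION '/data/raw/web_logs/';

    SELECT url, COUNT(*) AS hits
    FROM   web_logs
    GROUP  BY url
    ORDER  BY hits DESC
    LIMIT  10;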

The Hadoop architecture is able to efficiently distribute processing workloads amongst dozens or even hundreds of cost-effective worker nodes. This capability dramatically improves the performance of applications accessing large data stores. Hadoop support professionals view hundreds of gigabytes as small data stores and regularly build Hadoop architectures that access terabytes and petabytes of structured and unstructured data.

One of Hadoop’s biggest advantages is speed. Hadoop is able to generate reports in a fraction of the time required by traditional database processing engines, with reductions measured in orders of magnitude. Because of this access speed, Hadoop is quickly gaining acceptance in the IT community as a leading alternative to traditional database systems when large data store technologies are being evaluated.

Wrap-up

As stated previously, there is an endless array of offerings that focus on addressing large data store challenges. Architecture selection is the most important decision made during a warehouse development project. A correctly chosen architecture will allow the application to perform to expectations, provide the desired functionality and be easily monitored and administered. Incorrect architecture decisions may cause one or more of the following problems: poor performance, limited functionality, high total cost of ownership, complex administration and tuning, lack of scalability, poor vendor support, and poor reliability/availability. All market-leading database vendors understand the importance of addressing the challenges inherent in large data stores and have released new products and product enhancements designed to simplify administration and improve performance.

The post Data Warehouse Appliance Offerings appeared first on Remote DBA Experts.

Open Source Virtualization Project at Risk [VIDEO]

Fri, 2014-10-31 04:02

Transcript

Hi, welcome to RDX. Virtualization and cloud technology pretty much go hand-in-hand.

Many popular cloud providers, such as Amazon and Rackspace, use Xen, an open-source virtualization platform to optimize their environments.

According to Ars Technica, those behind the Xen Project recently released a warning to those using its platform. Apparently, a flaw within the program's hypervisor allows cybercriminals to corrupt a Xen virtual machine, or VM. From there, perpetrators could read information stored on the VM, or cause the server hosting it to crash. Monitoring databases hosted on Xen VMs is just one necessary step companies should take. Reevaluating access permissions and reinforcing encryption should also be priorities. 

Thanks for watching! Be sure to visit us next time for more advice on security vulnerabilities.

The post Open Source Virtualization Project at Risk [VIDEO] appeared first on Remote DBA Experts.