Featured Database Articles
Greg Larsen discusses the BUCKET_COUNT setting of a HASH index and shows how to use a new DMV, introduced in SQL Server 2014 alongside the In-Memory OLTP table functionality, to determine how evenly SQL Server distributes an In-Memory table's rows across the buckets of a HASH index.
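The DMV in question can be queried roughly as follows (a minimal sketch; the join to sys.indexes and the derived percentage column are illustrative, not taken from the article):

```sql
-- Inspect bucket usage for HASH indexes on In-Memory OLTP tables.
-- sys.dm_db_xtp_hash_index_stats is new in SQL Server 2014.
SELECT  OBJECT_NAME(h.object_id)  AS table_name,
        i.name                    AS index_name,
        h.total_bucket_count,
        h.empty_bucket_count,
        FLOOR(100.0 * h.empty_bucket_count
                    / h.total_bucket_count) AS empty_bucket_pct,
        h.avg_chain_length,
        h.max_chain_length
FROM    sys.dm_db_xtp_hash_index_stats AS h
JOIN    sys.indexes AS i
        ON  i.object_id = h.object_id
        AND i.index_id  = h.index_id;
```

As a rough reading: a very high empty-bucket percentage suggests BUCKET_COUNT is oversized, while long average chain lengths suggest it is too small (or that the index key has many duplicate values).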
The Change Data Capture feature of SQL Server captures DML changes made to a tracked table. Arshad Ali demonstrates how to leverage this feature.
Recently we presented a procedure for uploading data to a Windows Azure SQL Database from an on-premises SQL Server instance. Today we will accomplish the same objective with a more traditional extract, transform, and load (ETL) approach that relies on SQL Server Integration Services (SSIS).
For many of us, setting up an Oracle standby database has become fairly old hat. Just remember, keep everything on the standby server exactly the same as the primary, and everything will go fine. But what if you want your standby on the same server as the primary database? And why on earth would you want to do that anyway? Isn’t the point Disaster Recovery?
New index access paths in Oracle 11g and later releases can use existing multi-column indexes even when the column you're looking for isn't the leading column. Read on to see how Oracle accomplishes this feat.
Using RMAN to clone a database is old hat to a lot of folks. But for some, it can be a daunting task. Here's a refresher on cloning using RMAN.
With all of the news, articles, white papers and vendor products related to Big Data, it's easy to forget the data that drives our companies, manages sales, interacts with customers, and supports our mission-critical systems: the "other" data, the "little" data. If we want to incorporate big data into our enterprise, the crucial step is integrating it with our existing data.
Ask database administrators how they implement disaster recovery in their big data environments and you'll get two typical responses: that DR plans are unnecessary, and that backups would take up too much space while recovery would take too long. Despite this reasoning, a disaster recovery plan for your big data implementation may be essential to your company's future.
Big Data is a new method, a new process, a new IT paradigm for the storage and retrieval of data. Newness means change, and some staff may resist change. Mitigate this resistance by planning ahead for staff training, user training, and collaboration across teams.