Featured Database Articles
To help DBAs determine which tables and stored procedures might be good candidates for the new In-Memory OLTP tables in SQL Server 2014, Microsoft introduced the AMR Utility. Greg Larsen explores how to use this tool.
Continuing our discussion of implementing data services in Windows Azure, we turn our attention to remediating the incompatibilities that result from limitations inherent to Platform as a Service (PaaS) deployments and that must be addressed as part of the migration process.
Last month's article introduced the In-Memory table, or what Microsoft refers to as In-Memory OLTP (code-named Hekaton), a new type of table available in SQL Server 2014. That article demonstrated the basics of creating an In-Memory table. This month we'll focus on the different types of indexes you can place on In-Memory tables, and how those indexes support different search criteria.
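As a rough illustration of the two index types that article covers, a memory-optimized table in SQL Server 2014 can carry hash indexes (suited to equality lookups) and nonclustered range indexes (suited to range scans). The table and column names below are hypothetical, not taken from the article:

```sql
-- Hypothetical memory-optimized table (SQL Server 2014 syntax).
-- The HASH indexes suit equality predicates (WHERE CustomerID = @id);
-- the NONCLUSTERED index suits range predicates (WHERE OrderDate >= @d).
CREATE TABLE dbo.OrderHeader
(
    OrderID    INT       NOT NULL
        PRIMARY KEY NONCLUSTERED HASH WITH (BUCKET_COUNT = 1000000),
    CustomerID INT       NOT NULL
        INDEX IX_Customer HASH WITH (BUCKET_COUNT = 100000),
    OrderDate  DATETIME2 NOT NULL
        INDEX IX_OrderDate NONCLUSTERED
)
WITH (MEMORY_OPTIMIZED = ON, DURABILITY = SCHEMA_AND_DATA);
```

Note that BUCKET_COUNT must be chosen up front for hash indexes; a count well below the number of distinct key values degrades lookup performance.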
Compression can be an excellent tool for saving database storage, but be aware that compression levels can change when tables are updated on Exadata using any of the Hybrid Columnar Compression (HCC) types. Read on to learn more.
Partitioned tables continue to be used in many systems to enhance performance on large tables, and with its new features, Oracle Database 12c has taken some interesting steps to make managing them more effective and efficient.
Occasionally a developer believes he or she is better at enforcing referential integrity than the database. Unfortunately, such application-side checks aren't transactional, so these attempts fail. Read on to see how the situation can be improved by using Oracle's built-in features.
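As a minimal sketch of the built-in approach (table and column names here are hypothetical), a declarative foreign key constraint lets Oracle enforce the relationship atomically with each transaction, something ad hoc application-side checks cannot guarantee under concurrent access:

```sql
-- Hypothetical parent/child tables. The FOREIGN KEY constraint makes
-- Oracle reject any INSERT or UPDATE on emp that references a dept_id
-- not present in dept, enforced within the transaction itself.
CREATE TABLE dept (
    dept_id NUMBER       PRIMARY KEY,
    name    VARCHAR2(50) NOT NULL
);

CREATE TABLE emp (
    emp_id  NUMBER PRIMARY KEY,
    dept_id NUMBER NOT NULL,
    CONSTRAINT emp_dept_fk FOREIGN KEY (dept_id) REFERENCES dept (dept_id)
);
```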
Using advanced analytics to analyze business data is common, especially in large companies with many customer-facing systems. As more and more data becomes available, enterprises stage large data stores into the enterprise data warehouse. These Big Data implementations bring their own problems, and will require database administrators and support staff to redesign the data warehouse architecture.
The advent of big data compounds the DBA's problems, as multiple distributed applications now require access to a very large data store. What tuning options are available?
Big data is the latest craze. Hardware and software vendors have overwhelmed IT departments with high-speed analytical software, proprietary high-performance hardware, and columnar data stores, all promising lightning-fast answers to ad hoc analytical queries. Forgotten in this blast of technology are the database administrator's most important responsibilities: backup and recovery.