DB2 management, tutorials, scripts, coding, programming and tips for database administrators
Load tests give the database administrator (DBA) a wealth of valuable information and can make the difference between poor and acceptable application performance. But what about a big data environment? Are there any gotchas or traps associated with big data?
With all of the news, articles, white papers and vendor products related to Big Data, it's easy to forget the data that drives our companies, manages sales, interacts with customers, and supports our mission-critical systems: the "other" data ... the "little" data. If we want to incorporate big data into our enterprise, the crucial step is integrating it with our existing data.
Ask database administrators how they implement disaster recovery in their big data environments and you'll get two typical responses: that DR plans are not necessary, and that backups will take up too much space and recovery will take too long. Despite this reasoning, a disaster recovery plan for your big data implementation may be essential for your company's future.
Big Data is a new method, a new process, a new IT paradigm for storage and retrieval of data. Newness means change, and some staff may resist change. Mitigate these problems by planning ahead for staff training, user training, and collaboration across teams.
Using advanced analytics to analyze business data is common, especially in large companies with many customer-facing systems. As more and more data becomes available, the enterprise stages large data stores into the enterprise data warehouse. These Big Data implementations bring their own problems and issues, and will require database administrators and support staff to redesign the data warehouse architecture.
The advent of big data compounds the DBA's problems, as multiple distributed applications now require access to a very large data store. What tuning options are available?
Big data is the latest craze. Hardware and software vendors have overwhelmed IT departments with high-speed analytical software, proprietary high-performance hardware, and columnar data stores, all promising lightning-fast answers to ad hoc analytical queries. Forgotten in this blast of technology are the database administrator's most important responsibilities: backup and recovery.
To meet the upcoming 'Peak Season' demands on IT systems, database administrators (DBAs) need to prepare the database and its supporting infrastructure for increased resource demands. Being proactive now can pay big dividends by maintaining service level agreements (SLAs), avoiding outages and resource shortages, and ensuring a positive overall customer experience.
Many IT enterprises are starting pilot projects to implement big data solutions. As a DBA, are you ready to support these efforts, and integrate them into your current architecture, processes, and standards?
Big Data is here, or soon will be, for the majority of IT enterprises. Database administrators must now deal with large volumes of data and new forms of high-speed data analysis. If your responsibility includes performance tuning, here are the areas of focus that will become increasingly important in the age of Big Data.
Articles and advice on Big Data hardware and software solutions abound. The most popular topic is the role to be played by new analytics solutions such as analytical appliances, NoSQL database management systems (DBMSs) and Apache Hadoop software. How can the IT enterprise plan such implementations? What are the first steps?
The design requirements of an application usually determine the most effective database design and database administration support processes. If, however, the database administrator (DBA) is not present during the requirements definition process, subpar performance can result.