DB2 management, tutorials, scripts, coding, programming and tips for database administrators
Big data applications do not need the same infrastructure support teams as more common applications do. As the enterprise embraces big data, management assumes that staff sizes will decrease. What should be done with those unneeded technologists? One answer: convert them into technology consultants who collaborate and coordinate with the lines of business. In other words, give them customer-facing roles.
Technical support teams usually support familiar hardware and software configurations. Specialization in particular combinations of operating systems and database management software is common, and this allows some team members to gain in-depth experience that is extremely valuable in an enterprise IT setting. How has big data changed this paradigm?
Big data applications are here to stay. The promise of this technology is the ability to quickly and easily analyze large amounts of data and, from that analysis, derive changes to customer-facing systems. Management believes that the analysis and subsequent changes will drive up customer satisfaction, market share and profits, hopefully at a reasonable cost.
Many big data application implementations seem to begin with an existing data warehouse, one or more new high-volume data streams, and some specialized hardware and software. The data storage issue is often addressed by installing a proprietary hardware appliance that can store huge amounts of data while providing extremely fast data access. In these cases, do we really need to worry about database design?
Big data applications and their associated proprietary, high-performance data stores arrived on the scene a few years ago. With promises of incredibly fast queries, many IT shops implemented one or more of these combination hardware and software suites. However, few IT enterprises have implemented metrics that clearly measure the benefits of these systems. The expected monetary gains from big data applications have not yet materialized for many companies, due to inflated expectations. The solution: Measure resource usage, and use these measurements to develop quality metrics.
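The measurement-based approach above can be sketched in a few lines. This is a minimal illustration, not DB2-specific code: the sample elapsed times, the nearest-rank percentile helper, and the 500 ms quality target are all illustrative assumptions.

```python
# Sketch: turn raw resource-usage samples into simple quality metrics.
# All numbers and names here are illustrative assumptions.

def percentile(samples, pct):
    """Return the pct-th percentile (nearest-rank method) of a list of numbers."""
    ordered = sorted(samples)
    rank = max(1, round(pct / 100 * len(ordered)))
    return ordered[rank - 1]

# Elapsed times (ms) collected for one analytical query over a day.
elapsed_ms = [120, 95, 400, 210, 180, 2500, 150, 130, 175, 160]

metrics = {
    "median_ms": percentile(elapsed_ms, 50),
    "p95_ms": percentile(elapsed_ms, 95),
    "worst_ms": max(elapsed_ms),
}

# An example quality target: 95% of executions finish within 500 ms.
meets_target = metrics["p95_ms"] <= 500

print(metrics, meets_target)
```

Tracking a tail percentile rather than the average is deliberate: one slow outlier (the 2500 ms run above) is exactly what an average hides and a p95 metric exposes.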
Load tests give the database administrator (DBA) quite a lot of valuable information and may make the difference between poor and acceptable application performance. But what about a big data environment? Are there any gotchas or traps associated with big data?
With all of the news, articles, white papers and vendor products related to Big Data, it's easy to forget the data that drives our companies, manages sales, interacts with customers, and supports our mission-critical systems: the "other" data ... the "little" data. If we want to incorporate big data into our enterprise, the crucial step is integrating it with our existing data.
Ask database administrators how they implement disaster recovery in their big data environments and you'll get two typical responses: that DR plans are not necessary, and that backups will take up too much space while recovery will take too long. Despite this reasoning, a disaster recovery plan for your big data implementation may be essential for your company's future.
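The storage objection can be put in concrete terms with a back-of-envelope estimate. Every figure below (data volume, compression ratio, change rate, retention policy) is an illustrative assumption, not a recommendation for any real system.

```python
# Rough backup-storage estimate for a large data store.
# All figures are illustrative assumptions.

raw_tb = 200                 # size of the data store, in TB
compression_ratio = 0.35     # backup size as a fraction of raw size
weekly_fulls_kept = 4        # retention: four weekly full backups
daily_change_rate = 0.02     # fraction of data changed per day
incrementals_per_week = 6    # daily incrementals between fulls

full_tb = raw_tb * compression_ratio
incr_tb = raw_tb * daily_change_rate * compression_ratio

# Each retained week contributes one full backup plus its incrementals.
total_tb = weekly_fulls_kept * (full_tb + incrementals_per_week * incr_tb)
print(f"Estimated backup footprint: {total_tb:.1f} TB")
```

Under these assumptions the footprint is large but bounded, which is the point: "backups take too much space" is an argument for sizing and compressing them, not for skipping DR altogether.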
Big Data is a new method, a new process, a new IT paradigm for storage and retrieval of data. Newness means change, and some staff may resist change. Mitigate these problems by planning ahead for staff training, user training, and collaboration across teams.
Using advanced analytics to analyze business data is common, especially in large companies with many customer-facing systems. As more and more data becomes available, the enterprise stages large data stores into the enterprise data warehouse. These Big Data implementations bring their own problems and issues, and will require database administrators and support staff to redesign the data warehouse architecture.
The advent of big data compounds the DBA's problems, as multiple distributed applications now require access to a very large data store. What tuning options are available?
Big data is the latest craze. Hardware and software vendors have overwhelmed IT departments with high-speed analytical software, proprietary high-performance hardware, and columnar data stores promising lightning-fast answers to ad hoc analytical queries. Forgotten in this blast of technology are the database administrators' most important responsibilities: backup and recovery.