DB2 management, tutorials, scripts, coding, programming and tips for database administrators
The IT enterprise has grown: databases are bigger, and companies have more product lines, more services, more customers and more transactions. Business analysts who were once satisfied with subsets of only certain data elements for their queries now want it all, and then some.
As big data solutions grow in size and complexity, concerns arise about future performance. One way to assess the potential effects is to measure your big data application's health.
Until recently, business analytics against big data and the enterprise data warehouse had to come from sophisticated software packages. This was because many statistical functions such as medians and quartiles were not available in basic SQL, forcing the packages to retrieve large result sets and perform aggregations and statistics locally. Today, many database management systems have incorporated these functions into SQL, including IBM's flagship product, Db2.
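For example, recent Db2 releases include order-based aggregate functions such as PERCENTILE_CONT (and, on some platforms, MEDIAN), so these statistics can be computed inside the engine instead of in a client-side analytics package. The sketch below assumes a hypothetical SALES table with REGION and AMOUNT columns; check your Db2 version's SQL Reference for the functions available to you.

```sql
-- Hypothetical SALES(REGION, AMOUNT) table.
-- PERCENTILE_CONT computes an interpolated percentile over the
-- ordered values of AMOUNT, per REGION group, in the database
-- engine -- no large result set shipped to the client.
SELECT region,
       PERCENTILE_CONT(0.25) WITHIN GROUP (ORDER BY amount) AS q1_amount,
       PERCENTILE_CONT(0.50) WITHIN GROUP (ORDER BY amount) AS median_amount,
       PERCENTILE_CONT(0.75) WITHIN GROUP (ORDER BY amount) AS q3_amount
FROM sales
GROUP BY region;
```

Pushing the aggregation into SQL this way avoids the pattern the packages once relied on: retrieving every row and sorting locally just to find a median.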
A new separately-priced software offering from IBM on z/OS systems uses machine learning and artificial intelligence to assist the Db2 optimizer in choosing high-performance access paths for SQL statements.
Db2 Version 12 includes many new and improved features specifically aimed at improving application performance. As today's development teams are driven to implement applications at a faster pace, the DBMS must support the ability to retrieve data quickly, while at the same time reducing overall resource usage. Here is a detailed look at some modern transactional data processing issues and how Db2 meets these challenges.
IBM’s Db2 Version 12 for z/OS was designed to synergize with new IBM z14 hardware, which includes several new and updated options for hardware-assisted data encryption and compression. These features can be used by the database management system to store and retrieve encrypted and compressed data transparently without application knowledge or intervention. Read on to learn more.
IBM now provides an option to configure its Db2 Version 12 for z/OS and the complementary IBM Db2 Analytics Accelerator (IDAA) to permit concurrent transactional processing of operational data alongside analytics processing of data in the appliance. This new feature, zero-latency HTAP (hybrid transactional/analytical processing), uses a patented replication process that propagates native Db2 table changes to the IDAA data store, allowing BI queries to act on up-to-date data.
The European Union's General Data Protection Regulation (GDPR) went into effect on May 25, 2018, with a reach extending to companies worldwide. In response, companies increased their data security awareness, appointed data protection officers and updated their privacy policies. IT support staff responded with updated data dictionaries, flagging of personal data, encryption at various points (local and cloud storage, network traffic, etc.) and heightened security procedures. However, more work is needed. In this article we focus on what the DBA must do in the near term in order to anticipate and prevent performance and capacity issues.
IBM's Db2 Version 12 for z/OS has many new features that focus on performance and security measures, particularly for mobile and cloud applications. Advances in cryptographic hardware and query accelerator technologies facilitate rapid development of customer-facing applications and the embedding of big data queries in operational systems.
IBM presents version 12 of its flagship database product Db2 as "the ultimate enterprise database for business-critical transactions and analytics". What does this mean? In this slideshow we dive into the most critical changes in this version of Db2, including improvements for application enablement, DBA functions, online transaction performance and query performance.
What elements comprise current day disaster recovery planning, and how are large mainframe and big data applications recovered?
There has always been a need for some IT users to access data from multiple operational systems across the enterprise. In a process called data integration, companies developed a centralized IT solution that presented data from across the enterprise in a single application. However, with the advent of big data applications, there is now too much data across the enterprise to transport and store in a single place. Data integration had to be redefined so that separate data stores could be accessed in place, in real time.