
Best of 2020 – Top 10 Data Integration and Data Quality Blog Posts

Precisely Editor | December 16, 2020

Data integration connects today’s infrastructure with tomorrow’s technology to unlock the potential of all your enterprise data, while data quality helps you understand your data and ensure it’s accurate, consistent, and complete.

As 2020 ends, we wanted to share our most viewed Data Integration and Data Quality articles from the Precisely blog. Let’s count down the Top 10 blog posts of the year.

#10 Good Data Management = Happy Customers

Effective data management is more important than ever before, but it is also more difficult today than in the past. The volume of data that companies have to work with is increasing as cloud computing, mobile devices, the Internet of Things (IoT), and other technologies grow and evolve. To complicate matters, much of the data that is critical to an organization may reside in external systems. Finally, data privacy and data sovereignty regulations are increasing in scope, and security concerns are greater than ever. Read more >

#9 Data Quality: A Financial Services Industry Imperative

Trust is the keystone of the financial system. Over the past two decades, global financial regulations have established a system of governance intended to ensure and safeguard that trust. Dodd-Frank, Basel III, MiFID, EU Solvency II, and other major regulatory initiatives have mandated that financial institutions understand where their data comes from, attest to its accuracy and validity, and keep it secure. Financial services firms must proactively maintain data quality and security to comply with data governance standards that are constantly evolving. Read more >

#8 Key Takeaways from Spark+AI Summit

The 2020 Spark+AI Summit, organized by Databricks, hosted 60,000 virtual attendees. The keynotes focused on technical updates such as data integration and quality, and included multiple live demos and use cases. Read more >

#7 3 Data Quality Capabilities That Supercharge Data Governance

With growing regulations and greater public concern over data privacy, companies are increasingly focused on data governance. Yet there’s no single way for a business to approach data governance that will ensure perfect results. Instead, companies can adopt a number of different strategies, all of which can be made more effective when bolstered by key data quality capabilities and processes. Read more >

#6 Top Challenges in Global Enterprise Data Management

Large enterprises (and even many small and midsize companies) are increasingly dealing with highly complex IT environments, often spread across multiple geographies and systems. In addition to their core ERP, they use specialized software to manage pricing, demand planning, customer relationship management, human resources, point-of-sale, and a multitude of other mission-critical functions within the business. Data management has become more and more challenging as IT systems have grown to include specialized, domain-specific software. Read more >

eBook

Ensuring Trust and Quality in Big Data

Discover how a strong focus on data quality spanning the people, processes and technology of your organization will help ensure quality and trust in your analytics that drive business decisions.

#5 4 Steps Toward Better Data Management

In April 2019, digital magazine Raconteur predicted that by 2020, the digital universe would reach 44 zettabytes. That would mean there are 40 times more bytes than there are stars in the observable cosmos. That’s why putting a data management plan in place is so important.

For the enterprise, this statistic represents an opportunity as well as a risk; when information is managed properly, it’s valuable and helps you make better decisions. When information isn’t managed properly, it becomes more of a liability. Read more >

#4 Trillium Quality: Data Profiling and Quality with Efficiency and Agility

Efficiency and agility are key to driving valuable business outcomes with big data investments. Solving quality and consistency challenges at scale requires new approaches to discovering, analyzing, and governing assets in the data lake. Enterprises need to rethink data profiling and quality governance. Read more >

#3 How to Integrate Data from Legacy Systems into a Modern Data Environment

There are some very good reasons why many of the world’s most transaction-intensive companies are still running critical business applications on mainframes and using legacy data. Those legacy systems are highly reliable, secure, and scalable. In industries such as banking, insurance, and healthcare, these qualities are extremely important. Read more >

#2 Validation vs. Verification: What’s the Difference?

In layman’s terms, data verification and data validation may sound like they are the same thing. When you delve into the intricacies of data quality, however, these two important pieces of the puzzle are distinctly different. Knowing the distinction can help you to better understand the bigger picture of data quality. Read more >

#1 Best Practices for Modernizing Your Data Architecture

Can your current data architecture handle the massive influx of data coming into the enterprise every day? If not, it’s time to think about modernizing your data architecture to ensure you capture and manage one of the most valuable assets your organization has: its data. Read more >

To learn more about how to ensure data quality and integrity in the age of big data, download our eBook: Governing Volume: Ensuring Trust and Quality in Big Data.