Vizify Analytics


Data Analytics

How to stay ahead of Tableau’s latest features: Your guide to key community forums and events

In today’s fast-paced digital landscape, staying current with technology is not just beneficial—it’s essential. The evolution of many software platforms is accelerating, driven in large part by the growing influence of artificial intelligence. From personalised insights to automated analytics, AI is transforming not only how we use tools like Tableau but also how quickly new features and capabilities are developed and released. This rapid pace of innovation can be both exciting and overwhelming. For data professionals, business analysts, and Tableau users of all levels, keeping up with these changes is critical to maintaining relevance, unlocking value from data, realising the investment in licences, and driving informed decision-making across their organisations.

But how do you stay ahead when the software you’re using today might evolve significantly in just a few months? Fortunately, Tableau has built a wealth of resources and community forums to help users stay informed, skilled, and connected, staying true to the #datafam community it has built through years of initiatives. Whether you’re looking to preview new releases, learn best practices, or engage with other professionals in your field, Tableau offers a wide range of options to keep your knowledge up to date. Here’s a rundown of the most valuable forums and events for staying current with Tableau’s latest features:

1. Tableau Conference (TC)

The Tableau Conference is the cornerstone event for anyone serious about Tableau. Hosted annually, this global gathering brings together thousands of data enthusiasts, analysts, and developers to explore the latest Tableau innovations. It’s where new features are often announced and demonstrated live by the Tableau product team with ‘Devs on Stage’. In addition to product keynotes, the conference offers breakout sessions, hands-on training, and community networking. Whether you attend in person or join virtually, it’s one of the most efficient and engaging ways to absorb what’s new and what’s next.

🔗 Catch up on TC25 here: https://www.salesforce.com/tableau-conference/

2. Tableau on Tour

If you can’t make it to the main conference, Tableau on Tour brings the energy and insights of the flagship event to cities around the world, including the UK. These regional events are tailored to local audiences and often feature customer stories, product demos, and Tableau experts sharing practical tips. This format is perfect for users who want access to high-quality learning and networking without the need for long-distance travel.

🔗 Explore upcoming events: https://www.tableau.com/en-gb/community/events

3. Tableau User Groups (TUGs)

Tableau User Groups are local and virtual communities where Tableau users connect regularly to share ideas, demos, and use cases. These sessions are often informal, making them a great place for asking questions, solving challenges, and learning from peers. TUGs are typically organised by region, industry, or area of interest—such as Healthcare TUGs, Women in Data, or the London TUG (our local TUG). They’re one of the best ways to build a network and continuously grow your Tableau knowledge over time – they also give you a great opportunity to deliver short presentations to small groups of like-minded enthusiasts, should you wish to.

🔗 Join the London TUG: https://usergroups.tableau.com/london-tableau-user-group/

4. Tableau Community Slack

For more real-time interaction, the Tableau Community Slack is a dynamic space where thousands of users—from beginners to Tableau Ambassadors—engage in daily discussions. You can ask technical questions, share dashboards, explore new features, or just stay informed about the latest happenings in the Tableau ecosystem. It’s also a great way to discover quick tips, hear about upcoming events, and connect with Tableau employees and partners.

🔗 Join the Slack community: https://www.tableau.com/community/slack

5. Inside Track Series

Looking for a deeper dive into Tableau’s roadmap? The Inside Track Series is a monthly event hosted by Tableau product experts, offering technical walkthroughs, live demos, and detailed explanations of new features—often before they become widely available. Perfect for those who want to understand not just what’s new, but why it matters.

6. Release notes

Tableau release notes are a great way to see which features have been released in which versions – allowing you to build a case to download the latest and greatest Tableau. Clearly laid out in categories with descriptions, the Tableau release notes site provides an overview of the key features along with detailed updates, feature releases, and hotfixes.

🔗 Feature summary: https://www.tableau.com/en-gb/products/all-features
🔗 Detailed release notes: https://www.tableau.com/en-gb/support/releases

Staying current matters—across every industry

Whether you’re working in manufacturing, healthcare, retail, infrastructure, or any other data-driven sector, staying ahead of the curve is more important than ever. In a world where AI is accelerating the pace of software innovation, Tableau users need to be proactive in their learning journey. Fortunately, Tableau has fostered one of the most vibrant and supportive communities in the analytics space. By tapping into the resources outlined above, you won’t just stay current with new features—you’ll gain practical insights that help you apply them more effectively in your day-to-day work.

At Vizify, we help organisations across the UK and beyond turn complex data into clear, actionable intelligence. Our consultants are experts in building intuitive, enterprise-grade dashboards that not only visualise data effectively but also drive meaningful outcomes.

Let’s build smarter Tableau dashboards together

Whether you’re just getting started with Tableau or looking to scale your analytics capabilities, we’re here to help. From implementation and training to dashboard design and automation, Vizify is your trusted partner in transforming data into decisions.

📩 Get in touch to explore how we can support your data journey.
🔗 Learn more about our data analytics solutions.

Get industry insights and expert tips straight to your inbox

Data Applications

Power Apps in action: Turning simple ideas into efficient solutions

From a simple team request to a data collection app! Even the smallest operational tasks present an opportunity for innovation. When our team needed a quick way to collect food and drink preferences for an upcoming social event, I saw more than just a logistical challenge – I saw a chance to demonstrate the power of low-code solutions in action. Instead of flooding everyone’s inbox with emails and chasing down responses, I saw an opportunity to create something more engaging and efficient. As a data consultant, I couldn’t resist the chance to build a custom application. What started as an administrative task quickly turned into an application built in Power Apps—no heavy coding, no complicated tools—delivered with zero code, in record time.

The challenge: gather staff preferences, but make it easy

The task seemed simple: collect food, drink, and dietary preferences from the team for an event. But the real challenge was that it had to be quick, easy, and, of course, a little fun.

The solution: a no-code approach with Power Apps

Step 1: Create the app

Power Apps is a collection of low-code tools that allows you to build custom applications—usually for business purposes—as well as automation pipelines and data analysis. To build the app, I followed three simple steps:

For simplicity, I connected my app to an Excel file. While you can create a data-driven application that does not rely on a database, the real value lies in connecting it to pre-existing data in a database. For this scenario, however, we continued without a database link.

Step 2: Raw, high-quality data

With the App shared with the team, all responses were available immediately in the linked Excel file. Thanks to our structured way of collecting responses, matched with our validation processes, we were guaranteed high-quality data. No clean-up tasks or feedback loops with respondents – just high-quality, structured data.
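Outside Power Apps, the kind of entry-point validation that guaranteed this data quality can be sketched in a few lines of Python. This is an illustrative sketch, not the actual app logic, and the field names and allowed values are assumptions:

```python
# A minimal sketch (not the actual Power Apps logic) of entry-point
# validation for the event survey. Field names and allowed values
# ("name", "dietary", "drink") are illustrative assumptions.

ALLOWED_DIETARY = {"none", "vegetarian", "vegan", "gluten-free"}

def validate_response(response: dict) -> list[str]:
    """Return a list of validation errors; an empty list means the entry is accepted."""
    errors = []
    if not response.get("name", "").strip():
        errors.append("name is required")
    if response.get("dietary") not in ALLOWED_DIETARY:
        errors.append(f"dietary must be one of {sorted(ALLOWED_DIETARY)}")
    if not response.get("drink", "").strip():
        errors.append("drink is required")
    return errors

# A structured app rejects bad entries at the point of input;
# a free-text spreadsheet cell would accept both of these.
assert validate_response({"name": "Sam", "dietary": "vegan", "drink": "cola"}) == []
assert len(validate_response({"name": "", "dietary": "carnivore", "drink": ""})) == 3
```

Because every submission passes through the same rules before it is stored, the data lands clean – which is what removes the clean-up tasks and feedback loops.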
Step 3: The outcome—data collected; event sorted!

With clean data comes insights and reports that can be trusted. I used the responses from the App to build a shopping list that provided refreshments for everyone throughout the social event – no one left hungry.

Beyond Power Apps: The bigger picture of data applications

While Power Apps was the perfect no-code tool for this quick-win scenario, businesses with more complex data needs might require additional flexibility, customisation, and scalability. That’s where other data application frameworks come into play: Each tool offers unique strengths depending on your use case—from simple form-based data collection to enterprise-grade predictive analytics.

The real value of custom data applications

Data applications make simple data collection processes more efficient than Excel- or email-driven processes. However, they can offer many more benefits, including:

1. Centralised source of truth
    • Data application: All users work with the same real-time data stored in a central database.
    • Email/Excel: Prone to version control issues; multiple file versions can cause confusion and errors.

2. Real-time access & updates
    • Data application: Live updates mean users always see the most current data.
    • Email/Excel: Manual sending and versioning delays updates and increases the risk of outdated information being used.

3. Better data integrity & validation
    • Data application: Built-in validation rules prevent incorrect or incomplete entries.
    • Email/Excel: Easy to enter wrong data or overwrite formulas accidentally.

4. Improved collaboration
    • Data application: Multiple users can interact with the system simultaneously with proper access controls.
    • Email/Excel: Collaboration is clunky—usually involves back-and-forth emails and conflicting edits.

5. Enhanced reporting & insights
    • Data application: Can integrate dashboards, visualisations, and advanced analytics.
    • Email/Excel: Reporting is static and must be manually created and updated.

6. Integration with other systems
    • Data application: Easily connects with APIs, databases, and third-party tools (e.g., CRMs, ERPs, BI tools).
    • Email/Excel: Limited integration; often requires manual data export/import.

7. Access control & security
    • Data application: Role-based access ensures sensitive data is only visible to authorised users.
    • Email/Excel: Risky—spreadsheets can be emailed or forwarded without restrictions.

8. Scalability
    • Data application: Designed to handle growing data volumes and user needs.
    • Email/Excel: Becomes slow and unwieldy as data grows or processes get more complex.

9. Automation
    • Data application: Automates repetitive tasks like notifications, calculations, and workflow steps.
    • Email/Excel: Requires manual effort for most processes.

10. Auditability & tracking
    • Data application: Can log changes and track user actions for audit/compliance.
    • Email/Excel: Hard to trace changes and understand the history of data modifications.

What’s next?

This is just one small example of how a simple request can be turned into an automated, interactive data solution. In our upcoming blog series, we’ll explore:

Stay tuned!

Escape the manual data processes in your operational reporting and data collection workflows today! Get in touch to see how we can help you build smarter, automated, enterprise-grade data applications, or learn more about our data application solutions.

Company News

Vizify Analytics achieves ISO 27001 certification and EU DORA compliance

Last year, Vizify Analytics officially achieved ISO 27001:2022 certification, the globally recognised standard for managing information security. This milestone, along with our more recent compliance with the EU Digital Operational Resilience Act (DORA), highlights our commitment to safeguarding your data’s confidentiality, integrity, and availability.

What is ISO 27001:2022?

ISO 27001:2022 is a comprehensive framework detailing the requirements for an effective Information Security Management System (ISMS). By achieving this certification, we demonstrate our ability to maintain the highest security standards and continuously improve our practices to protect the sensitive data of our customers.

What does this mean for our customers?

A step further with the EU Digital Operational Resilience Act (DORA)

In addition to ISO 27001, for financial entities we are proud to present independent proof of our compliance with the EU Digital Operational Resilience Act (DORA) (EU 2022/2554), effective January 2025. This regulation strengthens our commitment to operational resilience and cybersecurity for financial entities. DORA emphasises five key pillars:

As a company, we are dedicated to providing cutting-edge data solutions while ensuring top-tier security standards. This achievement is made possible by our dedicated team and reinforces our promise to keep your data safe in an ever-evolving digital landscape.

Data Analytics, Data Engineering

Can Excel data fuel an enterprise-grade analytics strategy?

Modern cloud-based data technologies are revolutionising how businesses make informed decisions based on trusted data. These cloud technologies offer immense benefits, from seamless scalability to advanced process automation. Despite these advancements, every organisation still suffers from the proliferation of Excel out in the wild, and it’s not going anywhere anytime soon. Excel continues to be the go-to tool for most knowledge workers – mainly due to its general availability and familiarity, alongside its ability to input and manipulate data before adding calculations and analysing it. Excel’s versatility encourages creativity, leading to the development of critical shadow data processes that operate outside standard procedures. Whilst important to business operations, these shadow processes pose considerable risks that challenge data governance and control, ultimately undermining trust in the data. Let’s delve into these risks and explore how Excel-based processes can live harmoniously with enterprise-grade data management.

The six key risks of Excel shadow data processes

Risk 1: Quality, completeness and validity

Many Excel functions, tools, and techniques exist to attempt to capture complete, clean data – but for every approach we’ve seen, we’ve also seen an end-user creatively circumvent the ‘controls’ that have been put in place. Cell validations can easily be mistakenly overwritten by a simple copy-and-paste action. Columns can be added, or worse, removed too easily. Sheets are renamed, combined, or duplicated. Dates are entered in a wide range of formats. The versatility and accessibility of Excel are at the heart of its poor data quality and completeness. Users can quickly open a file and enter invalid inputs that are not allowable values and do not adhere to master data management standards. They can easily right-click and delete important information. Excel is hard to govern and often leads to poor data quality.
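The ‘dates entered in a wide range of formats’ problem is easy to demonstrate. The Python sketch below (the three formats are assumptions) shows why downstream consumers end up writing defensive parsing code for every workbook they inherit:

```python
# Illustration of the mixed-date-format risk: the same calendar date
# typed three different ways. The formats handled below are assumptions;
# every new style a user invents needs another explicit rule, and
# "03/04/2025" would silently mean 4 March under the US convention.
from datetime import datetime

raw_dates = ["03/04/2025", "2025-04-03", "3 April 2025"]

def parse_any(value: str) -> datetime:
    for fmt in ("%d/%m/%Y", "%Y-%m-%d", "%d %B %Y"):
        try:
            return datetime.strptime(value, fmt)
        except ValueError:
            continue
    raise ValueError(f"unrecognised date format: {value!r}")

parsed = [parse_any(d) for d in raw_dates]
assert len({d.date() for d in parsed}) == 1  # all normalise to 2025-04-03
```

A database column typed as a date rejects all but one canonical representation at the point of entry; a free-text Excel cell accepts all of them and defers the cost to whoever consumes the data later.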
Risk 2: Data silos

The core issue with Excel-related data quality is that the process relies on human interaction. Excel data is often human-generated, and those who input data are doing so for their own use – they are often not thinking about how the data can be systematically collected and stored in a database where its value can be leveraged by their colleagues. This leads to data being created for the needs of an individual running a ‘shadow’ process, ignoring the benefits that the data may bring to the wider business. Furthermore, we see the same data collected by many people within the same organisation, duplicating their efforts and reducing the overall productivity of the workforce. The question of productivity is amplified when we consider that data may be collected and input into Excel at different cadences, which can lead to two people returning different answers to the same question – which then takes time to unpick and reduces overall confidence and trust in the data. Excel can impact an organisation’s ability to produce a single picture of its position without a small army of people trawling file systems to provide data and insights. Silos prevent building a comprehensive view of the organisation and often prevent the creation of automated processes to free up time for their most valuable resources to make decisions.

Risk 3: Security

Data holds answers and insights on many important business topics, many of which are top secret and can provide an organisation with a competitive advantage over its competitors. It is therefore paramount that data is kept secure and governed, ensuring only those with permission and authority to access it can do so. Excel does have security features, but they are not robust or enterprise-grade. Workbooks and sheets can be locked; however, they do not require personal identification through an Identity Provider to access. Passwords are often shared.
Equally, whilst file storage has matured hugely over the last decade, with many organisations adopting enterprise-wide solutions such as OneDrive, Google Drive Enterprise, Box (and more!), Excel files are still commonly shared via email as attachments. Combined with the poor user-based security, emailing files with sensitive data increases an organisation’s exposure to a data leak, which may have an adverse impact. Furthermore, emailing Excel files further contributes to the issues discussed under ‘Data silos’ – giving multiple versions of the truth.

Risk 4: History and versioning

Some of the most powerful analytics require daily snapshots of data, allowing consumers to understand what has changed and moved in their data since the last time they looked at their reporting and analytics. Storing historical data in Excel makes it extremely challenging to track changes through time and understand how data is changing and drifting on a longitudinal basis. Volume and computing constraints present the largest challenges; however, the discipline of entering and processing the data at regular intervals also contributes. Equally, versioning the data presents its own unique challenges. When working with data, it is critical that a user understands where the data has come from, who owns it, and how frequently it is updated. With Excel, versioning data and keeping track of it as it moves through emails and file storage systems can make it nearly impossible to clearly identify the most up-to-date dataset to work with. File proliferation, tracking changes, and merging updates before tracing them back to the owners is a tedious, manual task that must be regularly undertaken.

Risk 5: Volumes

Excel will always have limitations on the amount of data it can store – the most recent version of Excel has a limit of 1,048,576 rows and 16,384 columns per sheet.
Whilst this volume of data is a limitation, we rarely see files which max out the row/column limit, as other volume-based challenges are met before the hard cap on rows and columns is reached. Whilst there is a cloud version of Excel, it is most commonly worked on locally in the desktop application. Files with large volumes become cumbersome and can take a long time simply to open. Furthermore, performance degradation is common, where some of the most basic formulas leave the user with

Data Engineering

Is your data ready for AI?

As we step into the era of enterprise AI, data leaders must ask themselves some critical questions. Are your data strategies robust enough to support advanced AI initiatives? Is your data managed and organised efficiently? Do you have the right tools for data visualisation and analytics to extract meaningful insights?

Your essential data readiness checklist for AI:

In today’s fast-paced world, ensuring your data is AI-ready involves addressing five key prerequisites:

The importance of having your data ready for AI cannot be overstated. AI and machine learning models thrive on high-quality data. Without it, AI projects are likely to falter, leading to inaccurate insights and poor decision-making. In today’s competitive landscape, riding the data wave is crucial. Leveraging AI to its fullest potential can drive innovation, efficiency, and a significant competitive edge. At Vizify Analytics, we specialise in comprehensive data strategy, data management, and data visualisation & analytics services. Our expert team is here to ensure your data is primed and ready to power your AI endeavours. Let’s unlock the full potential of your data together. Learn how we can help you navigate the journey to becoming an AI-driven enterprise.

Data Strategy

Ensuring data quality for AI

In the age of AI, data quality is paramount. Without clean, accurate, and up-to-date data, AI and machine learning models struggle to deliver meaningful insights. At Vizify Analytics, we understand the critical role that data quality plays in driving successful AI initiatives.

The high cost of poor data quality

Did you know that poor data quality costs organisations an average of $15 million per year in losses? This staggering figure, reported by Gartner in its 2017 Data Quality Market Survey, underscores the significant financial impact of data quality issues, indicating a widespread problem with data accuracy and reliability. Many businesses still suffer from these issues, leading to substantial revenue losses and missed opportunities due to poor decision-making.

Common data quality challenges

How Vizify Analytics can help

At Vizify Analytics, we specialise in comprehensive data strategy, data management, and data visualisation & analytics services. Our proprietary accelerator for data solutions, Unify, is designed to address the key challenges of maintaining data quality. It enables:

By ensuring data quality, you not only improve the accuracy of AI models but also enable comprehensive data-driven decision-making.

Unlock the full potential of your data

At Vizify Analytics, we are committed to helping you unlock the full potential of your data. With our expertise and advanced solutions, you can ensure that your data is primed and ready to power your AI endeavours. Learn how we can help you navigate the journey to becoming an AI-driven enterprise with our top-notch data quality accelerator, or explore our range of solutions to see how we can support your data quality initiatives.
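To make ‘data quality checks’ concrete, here is a generic Python sketch of the kind of automated completeness, uniqueness, and validity checks that can run on every data load. This is an illustration under assumed field names, not Unify’s implementation:

```python
# A generic sketch of automated data-quality checks (completeness,
# uniqueness, validity) – not Unify's implementation. The record
# structure and field names ("id", "amount") are assumptions.

def quality_report(records: list[dict]) -> dict:
    ids = [r.get("id") for r in records]
    return {
        "missing_amount": sum(1 for r in records if r.get("amount") is None),
        "duplicate_ids": len(ids) - len(set(ids)),
        "negative_amount": sum(1 for r in records if (r.get("amount") or 0) < 0),
    }

rows = [
    {"id": 1, "amount": 120.0},
    {"id": 1, "amount": None},  # duplicate id and missing value
    {"id": 2, "amount": -5.0},  # invalid negative amount
]
assert quality_report(rows) == {
    "missing_amount": 1,
    "duplicate_ids": 1,
    "negative_amount": 1,
}
```

Run on a schedule against each incoming dataset, a report like this turns silent quality drift into a visible, fixable backlog before the data ever reaches an AI model.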
