Content provided by Winfried Adalbert Etzel - DAMA Norway. All podcast content, including episodes, graphics, and podcast descriptions, is uploaded and provided directly by Winfried Adalbert Etzel - DAMA Norway or their podcast platform partner. If you believe someone is using your copyrighted work without your permission, you can follow the process outlined here: https://vi.player.fm/legal.

4#3 - Pedram Birounvand - A Paradigm Shift in Data through AI (Eng)


«The notion of having clean data models will be less and less important going forward.»
Unlock the secrets of the evolving data landscape with our special guest, Pedram Birounvand, a data veteran who has worked with notable companies like Spotify as well as in private equity. Pedram is CEO and Founder at UnionAll.
Together, we dissect the impact of AI and GenAI on data structuring, governance, and architecture, shedding light on the importance of foundational data skills amidst these advancements.
Peek into the future of data management as we explore Large Language Models (LLMs), vector databases, and the revolutionary RAG architecture that is set to redefine how we interact with data. Pedram shares his vision for high-quality data management and the evolving role of data modeling in an AI-driven world. We also discuss the importance of consolidating company knowledge and integrating internal data with third-party datasets to foster growth and innovation, ultimately bringing data to life in unprecedented ways.
Here are my key takeaways:

  • Whenever a new technology arrives, you need to adopt it and figure out how to apply it - often, at first, by using the new tools for the wrong problems.
  • There is substantial investment in AI, yet the use cases for applying AI are still not clear enough in many companies.
  • There is a gap in how technical and business people understand problems. Part of this problem is how we present and visualize the problem.
  • You need to create space for innovation - if your team is bogged down with operational tasks, you are cannibalizing your innovative potential.
  • Incubators in organizations are valuable, if you can keep them close to the problem to solve without limiting their freedom to explore.
  • The goal of incubators is not to live forever, but to become ingrained in the business.
  • CEOs need a combination of internal and external counsel.
  • Find someone in the operational setting to take ownership from the start.
  • The more data you have to handle, the better and clearer your Data Governance strategy should be.
  • Small companies find it easier to set clear standards for data handling, thanks to direct communication.
  • You want to make sure that you solve one problem really well, before moving on.
  • Before attempting change, find out what the culture and the strong incentives in your organization are.

LLMs as the solution for Data Management?

  • ChatGPT is already very good at classifying information.
  • It can create required documentation automatically when fed the right parameters.
  • It can supersede key-value search in finding information.
  • This can help to scale Data Governance and Data Management work.
  • Data Management will become more automated, but also much more important going forward.
  • RAG architecture - first build up your own knowledge database by vectorizing the data into a vector database.
  • The results from querying this database are used by the LLM for interpretation.
  • Find a way to consolidate all your input information into a single pipeline to build your knowledge database.
  • Building strong controls on naming conventions will be less important going forward.
  • Vectorized semantic search will be much faster.
  • Entity matching will become very important.
  • Fact tables and dimension tables become less important.
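The RAG flow sketched above - vectorize documents into a vector database, retrieve by semantic similarity, then hand the hits to an LLM for interpretation - can be illustrated minimally. This toy sketch substitutes term-frequency vectors and cosine similarity for a real embedding model and vector database, and all documents and names are illustrative, not from the episode:

```python
from collections import Counter
import math

def embed(text):
    """Toy embedding: a term-frequency vector. A real RAG setup would
    use a learned embedding model (e.g. a sentence transformer)."""
    return Counter(text.lower().split())

def cosine(a, b):
    """Cosine similarity between two sparse term-frequency vectors."""
    dot = sum(a[t] * b[t] for t in a if t in b)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

# Step 1: build the knowledge base by "vectorizing" each document.
documents = [
    "data governance policy for customer records",
    "quarterly revenue report for the sales team",
    "naming conventions for warehouse tables",
]
index = [(doc, embed(doc)) for doc in documents]

def retrieve(query, k=1):
    """Step 2: query the index; the top-k hits are what an LLM would
    receive as context in the generation step of RAG."""
    qv = embed(query)
    ranked = sorted(index, key=lambda item: cosine(qv, item[1]), reverse=True)
    return [doc for doc, _ in ranked[:k]]

print(retrieve("governance rules for customer data"))
```

Note how the retrieval works without any strict naming conventions or key-value lookup: similarity over vectorized content finds the relevant document, which is the point the takeaways make about semantic search.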

Data to value

  • Be able to benchmark your internal performance against the market.
  • Understand trends and how they affect you.
  • Using and aggregating third-party data is even harder than handling internal data.
  • You need to find ways to combine internal and third party data to get better insights.
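The idea above - combining internal data with third-party data to benchmark performance against the market - can be illustrated with a small join. All figures, keys, and field names here are hypothetical:

```python
# Hypothetical figures: internal monthly revenue ($M) joined with a
# third-party market-growth series on a shared month key.
internal = {"2024-01": 1.20, "2024-02": 1.35, "2024-03": 1.50}
market = {"2024-01": 0.02, "2024-02": 0.04, "2024-03": 0.03}

def benchmark(internal, market):
    """Join internal and third-party data on the shared key, then
    compare internal growth against the market trend."""
    months = sorted(set(internal) & set(market))
    rows = []
    for prev, cur in zip(months, months[1:]):
        internal_growth = internal[cur] / internal[prev] - 1
        rows.append({
            "month": cur,
            "internal_growth": round(internal_growth, 3),
            "market_growth": market[cur],
            "beats_market": internal_growth > market[cur],
        })
    return rows

for row in benchmark(internal, market):
    print(row)
```

The join on a shared key is the simple part; as the takeaways note, the hard work in practice is aggregating the third-party data to a granularity and semantics that match your internal data.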

Chapters

1. Data Management in the Nordics (00:00:00)

2. Navigating AI Hype and Implementation (00:08:26)

3. Data Literacy Challenges and AI Solutions (00:16:54)

4. Future of Data Management With AI (00:29:43)

5. Optimizing Data Integration for Growth (00:44:47)

74 episodes

Artwork
iconChia sẻ
 
Manage episode 437585733 series 2940030
Nội dung được cung cấp bởi Winfried Adalbert Etzel - DAMA Norway. Tất cả nội dung podcast bao gồm các tập, đồ họa và mô tả podcast đều được Winfried Adalbert Etzel - DAMA Norway hoặc đối tác nền tảng podcast của họ tải lên và cung cấp trực tiếp. Nếu bạn cho rằng ai đó đang sử dụng tác phẩm có bản quyền của bạn mà không có sự cho phép của bạn, bạn có thể làm theo quy trình được nêu ở đây https://vi.player.fm/legal.

«The notion of having clean data models will be less and less important going forward.»
Unlock the secrets of the evolving data landscape with our special guest, Pedram Birounvand, a veteran in data who has worked with notable companies like Spotify and in private equity. Pedram is CEO and Founder at UnionAll.
Together, we dissect the impact of AI and GenAI on data structuring, governance, and architecture, shedding light on the importance of foundational data skills amidst these advancements.
Peek into the future of data management as we explore Large Language Models (LLMs), vector databases, and the revolutionary RAG architecture that is set to redefine how we interact with data. Pedram shares his vision for high-quality data management and the evolving role of data modeling in an AI-driven world. We also discuss the importance of consolidating company knowledge and integrating internal data with third-party datasets to foster growth and innovation, ultimately bringing data to life in unprecedented ways.
Here are my key takeaways:

  • Always when a new technology arrives, you need to adopt and figure out how to apply the new technology - often by using the new tools for the wrong problem.
  • There is substantial investment in AI, yet the use cases for applying AI are still not clear enough in many companies.
  • There is a gap I how we understand problems between technical and business people. Part of this problem is how we present and visualizer the problem.
  • You need to create space for innovation - if your team is bugged down with operational tasks, you are canibalizing on innovative potential.
  • Incubators in organizations are valuable, if you can keep them close to the problem to solve without limiting their freedom to explore.
  • The goal of incubators is not to live forever, but top become ingrained in the business.
  • CEOs need a combination of internal and external council.
  • Find someone in the operational setting to take ownership from the start.
  • The more data you have to handle the better and clear should your Data Governance strategy be.
  • Small companies have it easier to set clear standards for data handling, due to direct communication.
  • You want to make sure that you solve one problem really well, before moving on.
  • Before intending to change, find out what the culture and the string incentives in your organization are.

LLMs as the solution for Data Management?

  • ChatGP already today very good at classifying information.
  • It can create required documentation automatically, by feeding the right parameters.
  • It can supersede key value search in finding information.
  • This can help to scale Data Governance and Data Management work.
  • Data Management will become more automated, but also much more important going forward.
  • RAG architecture - first build up your own knowledge database, with the help of vectorizing the data into a Vector-database.
  • The results from querying this database are used by the LLM for interpretation.
  • Find a way to consolidate all your input information into a single pipeline to build your knowledge database.
  • Building strong controls on naming conventions will be less important going forward.
  • Vectorized semantic search will be much faster.
  • Entity matching will become very important.
  • Fact tables and dimensional tables become less important.

Data to value

  • Be able to benchmark your internal performance to the market
  • undertand trends and how they affect you.
  • How to use and aggregate third party data is even harder than internal data.
  • You need to find ways to combine internal and third party data to get better insights.
  continue reading

Chương

1. Data Management in the Nordics (00:00:00)

2. Navigating AI Hype and Implementation (00:08:26)

3. Data Literacy Challenges and AI Solutions (00:16:54)

4. Future of Data Management With AI (00:29:43)

5. Optimizing Data Integration for Growth (00:44:47)

74 tập

Tous les épisodes

×
 
"Dataetik handler også om den måde, vi opfatter brugeren og mennesket, vores demokrati og vores samfund på." / "Data ethics is also about how we perceive the user and the human being, our democracy, and our society." In this episode, we dive into the complexities of data ethics with Gry Hasselbalch, a leading expert on the topic. With experience shaping EU regulations on data and AI ethics, she shares insights on why human values must remain at the core of digital development. We explore the principle of “humans at the center” and why people should be seen as more than just data points or system users. Gry discusses how artificial intelligence and big data challenge this idea and why human interests must take priority over commercial or institutional goals. Here are our hosts' key takeaways: Humans When we talk about data ethics we need to relate to a value set - in out case a European value set, based on human rights. Data Ethics is built around humans - a human-centric principle. That means that human interests are always prioritized, above organizational interests, commercial interests, or machine interests. User is not enough if we talk about human in the center: this will mean different things once the discussion includes AI. We need to talk about the whole human, not just the user or the data about the human. Systems have an influence on our life, and therefore the human needs to be seen as a holistic being. Regulations EU is seen as a «regulatory superpower» that has an ethical starting point when regulating. All cultures will have different interpretation and starting point of what ethics means. But through history we have been able to agree on an ethical baseline, like the charts of human rights. Human dignity is a central part of what ethics mean internationally. Regulation is not everything - remember that regulation happens due to an identified need. 
Regulations and laws are a guideline, but they do not cover (and cannot cover) the entire topic of data ethics. To ensure a value based approach to data handling, we need to go beyond regulations - talk about this as a societal challenge. Socio-technical Technology is not neutral - it is developed, applied within a certain cultural setting. Technical systems are part of society as much as society is part of the technical systems we develop and use. Maybe we should rather talk about «socio-technical infrastructure». There is a dichotomy in talking about data as something valuable and at the same time as a liability. Data ethics can be viewed as a competitive advantage, a way to induce trust and better an organizations reputation. AI and ethics AI is accelerating the need for ethical data decisions. AI is not created out of the blue, it is very much based on our data, our societal norms, developed by humans. AI is becoming a solution for «everything» - but what does that nean for human-machine relationship? AI is a tool, not a solution. What interests are pushing AI and what impact does AI have on our social systems and our culture? Data Ethics of Power - A Human Approach in the Big Data and AI Era Data Ethics - The New Competitive Advantage Human Power - Seven Traits for the Politics of the AI Machine Age…
 
«Leadership is about sowing the common vision and the common way forward, bringing the people with you.» How can a nuclear physicist transform into a data leader in the industrial sector? Kristiina Tiilas from Finland shares her fascinating journey from leading digitalization programs at Fortum to shaping data-driven organizations at companies like Outokumpu and Kemira. Kristiina provides unique insights into navigating complex data-related projects within traditional industrial environments. With a passion for skydiving and family activities, she balances a demanding career with an active lifestyle, making her an inspiring guest in this episode. We focus on the importance of data competence at the executive level and discuss how organizations can strengthen data understanding without a formal CDO role. Kristiina shares her experiences in developing innovative digitalization games that engage employees and promote a data-driven culture. Through concrete examples rather than technical jargon, she demonstrates how complex concepts can be made accessible and understandable. This approach not only provides a competitive advantage but also transforms data into an integral part of the company’s decision-making processes. Here are my key takeaways: The AI hype became a wake-up moment for Data professionals in Finland taking the international stage. As a leader in dat you need to balance data domain knowledge and leadership skills. Both are important. Leadership is important to provide an arena for your data people to deliver value. As a leader you are in a position that requires you to find ways of making tacit knowledge explicit. If not you are nit able too use that knowledge to train other people or a model. CDO The Chief Data Officer is not really present in Nordic organizations. An executive role for data is discussed much, but in reality not that widespread. Without CDO present, you need to train somebody in the top leadership group to voice data. 
CDO is different in every organization. Is CDO an intermediate role, to emphasis Data Literacy, or a permanent focus? You can achieve a lot through data focus of other CxOs. Make data topics tangible, this is about lingo, narratives, but also about ways of communicating - Kristiina used gamification as a method. Creating a game to explain concepts in very basic terms with clear outcomes and structure can help with Data Literacy for the entire organization. Data in OT vs. IT Predictions and views on production should be able to be vision also in Operational Settings on all levels. There should not be any restriction in utilizing analytical data in operational settings. Security and timeliness are the big differentiators between OT and IT. These are two angles of the same. They need to be connected. IoT (Internet of Things) requires more interoperability. Extracting data has been a one way process. The influence of Reverse ETL on OT data is interesting to explore further. There are possibilities to create data driven feedback loops in operations. Data Teams If you start, start with a team of five: One who knows the data (Data Engineering) One who knows the business One who understands Analytics / AI One who understands the users / UX One to lead the team You can improve your capabilities one step at a time - build focus areas that are aligned with business need an overall strategy. If you expect innovation from your data team, you need to decouple them from the operational burden. Show your value in $$$.…
 
"Vi modellerer for å forstå, organisere og strukturere dataene." / "We model to understand, organize, and structure the data." This episode with Geir Myrind, Chief Information Architect, offers a deep dive into the value of data modeling in organizations. We explore how unified models can enhance the value of data analysis across platforms and discuss the technological development trends that have shaped this field. Historical shifts toward more customized systems have also challenged the way we approach data modeling in public agencies such as the Norwegian Tax Administration. Here are my key takeaways: Standardization Standardization is a starting point to build a foundation, but not something that let you advance beyond best practice. Use standards to agree on ground rules, that can frame our work, make it interoperable. Conceptual modeling is about understanding a domain, its semantics and key concepts, using standards to ensure consistency and support interoperability. Data Modeling Modeling is an important method to bridge business and data. More and more these conceptual models gain relevance for people outside data and IT to understand how things relate. Models make it possible to be understood by both humans and machines. If you are too application focused, data will not reach its potential and you will not be able to utilize data models to their full benefits. This application focus which has been prominent in mainstream IT for many years now is probably the reason why data modeling has lost some of its popularity. Tool advancement and new technology can have an impact on Data Management practices. New tools need a certain data readiness, a foundation to create value, e.g. a good metadata foundation. Data Modeling has often been viewed as a bureaucratic process with little flexibility. Agility in Data Modeling is about modeling being an integrated part of the work - be present, involved, addressed. 
The information architect and data modeling cannot be a secretary to the development process but needs to be involved as an active part in the cross-functional teams. Information needs to be connected across domains and therefore information modeling should be connected to business architecture and process modeling. Modeling tools are too often connected only to the discipline you are modeling within (e.g. different tools for Data vs. Process Modeling). There is substantial value in understanding what information and data is used in which processes and in what way. The greatest potential is within reusability of data, its semantics and the knowledge it represents. The role of Information Architect Information Architects have played a central role for decades. While the role itself is stable it has to face different challenges today. Information is fluctuant and its movement needs to be understood, be it through applications or processes. Whilst modeling is a vital part of the work, Information Architects need to keep a focus on the big picture and the overhauling architecture. Information architects are needed both in projects and within domains. There is a difference between Information and Data Architects. Data Architects focus on the data layer, within the information architecture, much closer to decisions made in IT. The biggest change in skills and competency needs for Information Architects is that they have to navigate a much more complex and interdisciplinary landscape. Metadata Data Catalogs typically include components on Metadata Management. We need to define Metadata broader - it includes much more than data about data, but rather data about things.…
 
"Den største utfordringen, det viktigste å ta tak i, det er å standardisere på nasjonalt nivå. / The biggest challenge, the most important thing to address, is standardizing at the national level." The healthcare industry is undergoing a significant transformation, driven by the need to modernize health registries and create a cohesive approach to data governance. At the heart of this transformation is the ambition to harness the power of data to improve decision-making, streamline processes, and enhance patient outcomes. Jørgen Brenne, as a technical project manager, and Marte Kjelvik’s team, have been instrumental in navigating the complexities of this change. Their insights shed light on the challenges and opportunities inherent in healthcare data modernization. Here are my key takeaways: Healthcare data and registry Its important to navigate different requirements from different sources of authority. To maintain comprehensive, secure, and well-managed data registries is a challenging task. We need a national standardized language to create a common understanding of health data, what services we offer within healthcare and how they align. Authorities need also to standardize requirements for code and systems. National healthcare data registry needs to be more connected to the healthcare services, to understand data availability and data needs. Competency Data Governance and Data Management are the foundational needs the registry has recognized. Dimensional Modeling was one of the first classes, they trained their data team on, to ensure this foundational competency. If the technology you choose supports your methodology, your recruitment of new resources becomes easier, since you don’t need to get experts on that very methodology. Models User stories are a focus point and prioritized. Data Lineage (How data changed through different systems) is not the same as Data Provenience (Where is the datas origin). 
You need both to understand business logic and intent of collection) - User stories can help establish that link. Understanding basic concepts and entities accounts for 80% of the work. Conceptual models ensured to not reflect technical elements. These models should be shareable to be a way to explain your services externally. Could first provides an open basis to work from that can be seen as an opportunity. There are many possibilities to ensure security, availability, and discoverability. Digitalization in Norwegian public services has brought forth a set of common components, that agencies are encouraged to use across public administration. Work based on experiences and exchange with others, while ensuring good documentation of processes. Find standardized ways of building logical models, based on Data Contracts. By using global business keys, you can ensure that you gain structured insight into the data that is transmitted. Low Code tools generate generic code, based on the model to ensure effective distribution and storage of that data in the registry. The logical model needs to capture the data needs of the users. Data Vault 2.0 as a modeling tool to process new dats sources and adhering to a logical structure. There is a discipline reference group established to ensure business alignment and verification of the models. Data should be catalogued as soon as it enters the system to capture the accompanying logic. Data Vault Adaptable to change and able to coordinated different sources and methods. It supports change of formats without the need to change code. It makes parallel data processing possible at scale. Yet due to the heterogeneity of data vault, you need some tool to mange.…
 
«Data Management is an interesting one: If it fails, what’s the feedback loop?» For the Holiday Special of Season 4, we’ve invited the author of «Fundamentals of Data Engineering», podcast host of the «Joe Reis Show», «Mixed Model Arts» sensei, and «recovering Data Scientist» Joe Reis. Joe has been a transformative voice in the field of data engineering and beyond. He is also the author of the upcoming book with the working title "Mixed Model Arts", which redefines data modeling for the modern era. This episode covers the evolution of data science, its early promise, and its current challenges. Joe reflects on how the role of the data scientist has been misunderstood and diluted, emphasizing the importance of data engineering as a foundational discipline. We explore why data modeling—a once-vital skill—has fallen by the wayside and why it must be revived to support today’s complex data ecosystems. Joe offers insights into the nuances of real-time systems, the significance of data contracts, and the role of governance in creating accountability and fostering collaboration. We also highlight two major book releases: Joe’s " Mixed Model Arts ", a guide to modernizing data modeling practices, and our host Winfried Etzel’s book on federated Data Governance , which outlines practical approaches to governing data in fast-evolving decentralized organizations. Together, these works promise to provide actionable solutions to some of the most pressing challenges in data management today. Join us for a forward-thinking conversation that challenges conventional wisdom and equips you with insights to start rethinking how data is managed, modeled, and governed in your organization. Some key takeaways: Make Data Management tangible Data management is not clear enough to be understood, to have feedback loops, to ensure responsibility to understand what good looks like. Because Data Management is not always clear enough, there is a pressure to make it more tangible. 
That pressure is also applied to Data Governance, through new roles like Data Governance Engineers, DataGovOps, etc. These roles mash enforcing policies with designing policies. Data Contracts Shift Left in Data needs to be understood more clearly, towards a closer understanding and collaboration with source systems. Data Contracts are necessary, but it’s no different from interface files in software. It’s about understanding behavior and expectations. Data Contracts are not only about controlling, but also about making issues visible. Data Governance Think of Data Governance as political parties. Some might be liberal, some more conservative. We need to make Data Governance lean, integrated and collaborative, while at the same time ensuring oversight and accountability. People need a reason to care about governance rules and held accountable. If not Data Governance «(...) ends up being that committee of waste.» The current way Data Governance is done doesn’t work. It needs a new look. Enforcing rules, that people don’t se ant connection to or ownership within are deemed to fail. We need to view ownership from two perspectives - a legal and a business perspective. They are different. Data Modeling Business processes, domains and standards are some of the building blocks for data. Data Modeling should be an intentional act, not something you do on the side. The literature on Data Modeling is old, we are stuck in a table-centric view of the world.…
 
«We want to make data actionable.» Join us for an engaging conversation with Shuang Wu, Mesta's lead data engineer. We delve into the concept of platforms and explore how they empower autonomous delivery teams, making data-driven decisions a central part of their strategy. Shuang discusses the intricate process of evolving from a mere data platform to a comprehensive service platform, especially within organizations that aren't IT-centric. Her insights emphasize a lean, agile approach to prioritize use cases, focusing on quick iterations and prototypes that foster self-service and data democratization. We explore the potential shift towards a decentralized data structure where domain teams leverage data more effectively, driving operational changes and tangible business value in their pursuit of efficiency and impact. My key learnings: It’s not just about gaining insights, but also about harmonizing and understanding data in context. Find your SMEs and involve them closely - you need insight knowledge about the data and pair that with engineering capabilities. Over time the SMEs and the central data team share experiences and knowledge. This creates a productive ground for working together. The more understanding business users gain on data, the more they want to build themselves. Central team delivers core data assets in a robust and stable manner. Business teams can build on that. The Data You can integrate and combine internal data with external sources (like weather data, or road network data) to create valuable insights. Utilizing external data can save you efforts, since it often is structured and API ready. Dont over-engineer solutions - find you what your user-requirements are and provide data that match the requirements, not more. Use an agile approach to prioritize use cases together with your business users. Ensure you have a clear picture of potential value, but also investment and cost. Work in short iterations, to provide value quickly and constantly. 
Understand your platform constrains and limitations, also related to quality. Find your WHY! Why am I doing the work and what does that mean when it comes to prioritization? What is the value, impact and effort needed? Service Platform: Is about offering self-service functionality. Due to the size of Mesta it made sense to take ownership for many data products centrally, closely aligned with the platform. Build it as a foundation, that can give rise to different digitalization initiatives. If you want to make data actionable they need to be discoverable first. The modular approach to data platform allows you to scale up required functionality when needed, but also to scale to zero if not. Verify requirements as early as you can. Working with business use cases Visibility and discoverability of data stays a top priority. Make data and AI Literacy use case based, hands-on programs You need to understand constrains when selecting and working with a business use case. Start with a time-bound requirements analysis process, that also analyses constraints within the data. Once data is gathered and available on the platform, business case validity is much easier to verify. Gather the most relevant data first, and then see how you can utilize it further once it is structured accordingly. Quite often ideas originate in the business, and then the central data team is validating if the data can support the use case.…
 
«I think we are just seeing the beginning of what we can achieve in that field.» Step into the world of data science and AI as we welcome Victor Undli, a leading data scientist from Norway, who shares his insights into how this field has evolved from mere hype to a vital driver of innovation in Norwegian organizations. Discover how Victor's work with Ung.no, a Norwegian platform for teenagers, illustrates the profound social impact and value creation potential of data science, especially when it comes to directing young inquiring minds to the right experts using natural language processing. We'll discuss the challenges that organizations face in adopting data science, particularly the tendency to seek out pre-conceived solutions instead of targeting real issues with the right tools. This episode promises to illuminate how AI can enhance rather than replace human roles by balancing automation with human oversight. Join us as we explore the challenges of bridging the gap between academia and industry, with a spotlight on Norway's public sector as a cautious yet progressive player in tech advancement. Victor also shares his thoughts on developing a Norwegian language model that aligns with local values and culture, which could be pivotal as the AI Act comes into play. Learn about the unique role Norway can adopt in the AI landscape by becoming a model for small countries in utilizing large language models ethically and effectively. We highlight the components of successful machine learning projects: quality data, a strong use case, and effective execution, and encourage the power of imagination in idea development, calling on people from all backgrounds to engage. Here are my key takeaways: Get started as Data Scientist Expectations from working with cutting edge tech, and chasing the last percentage of precision. Reality is much more messy. Time management and choosing ideas carefully is important. 
«I end up creating a lot of benchmark models in the time given, and then try to improve them in a later iteration.» Data Science studies are very much about deep-diving into models and their performance, almost unconcerned with technical limitations. A lot of tasks when working with Data Science are in fact Data Engineering tasks. Closing the gap between academia and industry is going to be hard. Data Science is a team sport - you want someone to exchange ideas with and work together with. Public vs. Private: There is a difference between the public and private sector in Norway. The public sector in Norway is quite advanced in technological development. The public sector acts more carefully. Stakeholder Management and Data Quality: It is important to communicate clearly and consistently with your stakeholders. You have to compromise between stakeholder expectations and your constraints. If you don’t curate your data correctly, it will lose some of its potential over time. Data Quality is central, especially when data is used for AI models. Data Curation is also a lot about Data Enrichment - filling in the gaps. AI and the need for a Norwegian LLM: AI can be categorized into the brain and the imagination. The brain is to understand, the imagination is to create. We should invest time into creating an open-source Norwegian LLM as a competitive choice. Language encapsulates culture. You need to embrace language to understand culture. Norway’s role is as a strong consumer of AI. That also means leading by example. Norway and the Nordic countries can bring a strong ethical focus to the table.…
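Victor's benchmark-first workflow can be illustrated with a minimal sketch. This is not code from the episode; it shows a hypothetical first iteration in which the simplest possible baseline (always predict the majority label seen in training) sets the bar that later models must beat. The helper name and the routing labels are invented for illustration.

```python
from collections import Counter

def majority_baseline(train_labels, test_labels):
    """Score the simplest possible benchmark: always predict the
    most common label observed in the training data."""
    majority = Counter(train_labels).most_common(1)[0][0]
    hits = sum(1 for y in test_labels if y == majority)
    return majority, hits / len(test_labels)

# Toy topic labels for incoming questions (hypothetical data):
train = ["health", "school", "health", "health", "relationships"]
test = ["health", "school", "health"]

label, accuracy = majority_baseline(train, test)
print(label, round(accuracy, 2))  # health 0.67
```

Any real model that cannot beat this number in a later iteration is not yet adding value, which makes the baseline a cheap way to manage time and choose ideas carefully.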
 
«Focusing on the end-result you want, that is where the journey starts.» Curious about how Decision Science can revolutionize your business? Join us as our guest Rasmus Thornberg from Tetra Pak guides us through his journey of transforming complex ideas into tangible, innovative products. Aligning AI with business strategies can be a daunting task, especially in conservative industries, but it’s crucial for modern organizations. This episode sheds light on how strategic alignment and adaptability can be game-changers. We dissect the common build-versus-buy dilemma, emphasizing that solutions should focus on value and specific organizational needs. Rasmus's insights bring to life the role of effective communication in bridging the divide between data science and executive decision-making, a vital component in driving meaningful change from the top down. Learn how to overcome analysis paralysis and foster a learning culture. By focusing on the genuine value added to users, you can ensure that technological barriers don't stall progress. Rasmus shares how to ensure the products you build align perfectly with user needs, creating a winning formula for business transformation. Here are my key takeaways: Decision Science: You need to understand the cost of error of an ML/AI application. The cost of error limits the usability of AI. Decision Science is a broader take on Data Science, combining Data Science with Behavioral Science. Decision Science covers the cognitive choices that lead to decisions. Decision Science can only work in close proximity to the end user and the product, something that has been a challenge for many. From use case to product: Lots of genAI use cases are about personal efficiency, not about improving any specific organizational target. Differentiating between genAI and analytical AI can help to understand what the target is. The genAI hype has created interest from many. 
You can use it as a vessel to talk about other things related to AI, or even to push Data Governance. When selecting use cases, think about adoption and how it will affect the organization at large. When planning a use case, find where the uncertainties are and what the possible outcomes are. It’s easy to jump to the HOW by solving business use cases, but you really need to identify the WHY and WHAT first. Analysis-paralysis is a real problem when it comes to moving from ideation to action, or from PoC to operations. «Assess your impact all the time.» You need to have a feedback loop and concentrate on the decision making, not the outcome. A good decision is based on the information you had available before you made the decision, not on the outcome of the decision. A learning culture is a precondition for better decision making. If you correct your actions just one or two steps at a time, you can still go in the wrong direction. Sometimes you need to go back to the start and see your entire progress. The need for speed can lead to directional constraints in your development of solutions. Be aware of measurements and metrics becoming the target. When you build a product, you need to set a threshold for when to decommission it. Strategic connection: The more abstract you get, the higher the value you can create, but the risk also gets bigger. The biggest value we can gain as companies is to adapt our business model to new opportunities. The more organizations go into a plug-n-play mode, the less risk, but also the fewer value opportunities. Industrial organizations live with outdated constraints, especially when it comes to the cost of decision making. Don’t view strategy as a constraint, but rather as a direction that can provide flexibility.…
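The "cost of error" idea from the takeaways above can be made concrete with a standard decision-theory sketch (not from the episode): with asymmetric error costs, acting becomes the cheaper choice once the event probability exceeds the break-even point p* = C_fp / (C_fp + C_fn). Function names and the cost figures are illustrative assumptions.

```python
def act_threshold(cost_false_positive, cost_false_negative):
    """Break-even probability above which acting is the cheaper choice.

    Acting when the event does not occur costs cost_false_positive;
    staying passive when it does occur costs cost_false_negative.
    Expected costs are equal at p* = C_fp / (C_fp + C_fn).
    """
    return cost_false_positive / (cost_false_positive + cost_false_negative)

def should_act(p_event, cost_fp, cost_fn):
    """Act whenever the predicted probability clears the break-even point."""
    return p_event > act_threshold(cost_fp, cost_fn)

# A missed machine failure (1000) is far costlier than a needless
# inspection (50), so even a 10% failure probability justifies acting:
print(act_threshold(50, 1000))     # ~0.048
print(should_act(0.10, 50, 1000))  # True
```

The same model also shows why a high cost of error limits the usability of AI: when both error costs are large relative to the value of a correct decision, no threshold setting makes the application worthwhile without a human in the loop.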
 
«We made a transition from being a company that produces a lot of data, to a company which has control over the data we are producing.» Unlock the secrets of optimizing supply chains with data and AI through the lens of TINE, Norway's largest milk producer. Our guest, Olga Sergeeva, head of the analytics department at TINE, takes us on her journey from a passion for mathematics to spearheading digital transformation in the fast-moving consumer goods industry. Ever wondered how organizations can successfully integrate AI tools into their business processes? This episode dives into the uneven digital maturity across departments and the strategies used to overcome these challenges. We discuss how data visualization tools act as a gateway to AI, making advanced algorithms accessible without needing to grasp the technical nitty-gritty. Olga shares how TINE’s data department empowers users by providing crucial expertise while ensuring they understand the probabilistic nature of AI-generated data. Finally, discover how teamwork and a systematic approach can drive data adoption to new heights. From improving milk quality with predictive algorithms to optimizing logistics and production planning, we explore practical AI use cases within TINE's supply chain. Here are my takeaways: Mathematics is a combination of beauty, art, and structure. Find your way in data and digitalization before jumping on the AI train. Ensure that people can excel at what they are best at - this is what TINE tries to do for the farmers. Data only has value when it can be used - find ways to use data, from analytics to prediction to more advanced algorithms. Create a baseline through a maturity assessment to see how you can tailor your work to the different business units. Follow up and monitor the usage of your data tools in the different areas of your business. Create a gateway into data for your business users: once that gateway is established, it is also easier to introduce new tools. 
Data Literacy has a limit - not everyone in the business needs to be a data expert. Yet you need someone you can trust to enable and provide guidance - the Data team. Business users need to understand the difference between concrete answers and probabilities. How do you transform a complex organization without breaking the culture? Your data/digital/AI transformation team is key in ensuring good transformative action without breaking the culture. Ensure you have good ambassadors for your data work in the Business Units, who want to transfer their knowledge within their respective units. Create a network of data-interested people who help to drive adoption. Engage people by showing an initial value. Offer courses and classes for people to learn and understand more, but also to spread the word about your focus points. In-house courses provided by your own staff can increase confidence in your data team. AI can mean different things to different people. It is important to define AI in your setting. Don’t replace existing work processes with AI-driven solutions just for the sake of it. Find ways to focus on where improvement actually provides business value. When you think of a new AI project, you have several options: develop in-house, buy off the shelf, or do nothing. Option two should be your preferred solution. AI strategy is part of a larger ecosystem, with conditions to adhere to. Data and algorithms should become interconnected, and also be visually represented. «Always remember your core business.»…
 
"Det er vanskelig å komme seg ut av det jeg kaller: et excel-helvete. / It is hard to escape, what I call: Excel-hell." Are you wondering how medium-sized companies can handle data strategy and data governance effectively? Join us as we talk to May Lisbeth Øversveen, who has over 23 years of experience in the industry, and shares her expertise from Eidsiva Bredbånd. She provides us with insight into how to work with data maturity and the implementation of data strategy. How can mid-sized companies balance resources and create effective data governance strategies? May Lisbeth and I explore this topic in depth. We talk about the importance of involving the business units early in the process in order to create ownership and commitment around the improvement measures. Here are my key takeaways: The way we talk about data as a profession has changed, the lingo has changed and we adopt to trends. To display and evaluate data from different sources that are not connected, excel becomes the tool of choice. There is a very calculated amount of resources, that limit your ability to set up substantial teams to work exclusively on eg. Data Governance. Data Governance in SME (Small Medium sized enterprises) can be modeled as a repeatable process that incrementally enhances your data governance maturity. Identify sizable initiatives, ensure that they can be handled with a set amount of resources, and create metrics that enable you to track your progress. You need to find ways to ensure observability and monitoring over time. Don’t create something that you have no resources to maintain and improve going forward. To identify the right initiatives at the right time, you need to ensure a close collaborating with your business users. Ensure transparent and traceable ownership of the initiatives from the business side. To create a movement and engagement in data requires continuous and structured communication. 
Data Maturity Assessment: There is a need for speed and agility in SMEs to stay competitive. Data Maturity Assessments are a welcome introduction to ensure that you create a baseline when working with data. There are advantages both to an internal view and to getting some external perspective on your data maturity. Results from a maturity assessment can be a reality check that is not always easy to convey, yet you need to be realistic. Maturity assessments should ideally be both modeled/tailored to the needs of the organizations in question, and repeatable and comparable over time and across organizations. Good assessments cover both. To initially increase your maturity you can pick different kinds of tasks: low-hanging fruits; «duct-taped» operations that you can finally rectify; known problems that are visible; pain points for your business users. It is good to start with cases that are understandable for business users, create interest, and can easily show value to leadership - this is what creates buy-in. You need to ensure that you keep clear communication towards bigger, more substantial tasks, so your resources are not limited to quick-win actions. Data Strategy: Data Strategy needs to be closely aligned with business strategy. Have a clear vision of where you want to go. A structured approach to your data strategy, covering people, process, and technology, is important for any work with data. Technology is not the starting point, but rather a consequence of your strategic choices, your organizational setup, and your available resources. You need to include well-defined metrics to track progress. Find metrics that are closely connected to business outcomes and value creation.…
 
«The notion of having clean data models will be less and less important going forward.» Unlock the secrets of the evolving data landscape with our special guest, Pedram Birounvand, a veteran in data who has worked with notable companies like Spotify and in private equity. Pedram is CEO and Founder at UnionAll. Together, we dissect the impact of AI and GenAI on data structuring, governance, and architecture, shedding light on the importance of foundational data skills amidst these advancements. Peek into the future of data management as we explore Large Language Models (LLMs), vector databases, and the revolutionary RAG architecture that is set to redefine how we interact with data. Pedram shares his vision for high-quality data management and the evolving role of data modeling in an AI-driven world. We also discuss the importance of consolidating company knowledge and integrating internal data with third-party datasets to foster growth and innovation, ultimately bringing data to life in unprecedented ways. Here are my key takeaways: Whenever a new technology arrives, you need to adapt and figure out how to apply it - often by first using the new tools for the wrong problem. There is substantial investment in AI, yet the use cases for applying AI are still not clear enough in many companies. There is a gap in how we understand problems between technical and business people. Part of this problem is how we present and visualize the problem. You need to create space for innovation - if your team is bogged down with operational tasks, you are cannibalizing your innovative potential. Incubators in organizations are valuable if you can keep them close to the problem to solve without limiting their freedom to explore. The goal of incubators is not to live forever, but to become ingrained in the business. CEOs need a combination of internal and external counsel. Find someone in the operational setting to take ownership from the start. 
The more data you have to handle, the better and clearer your Data Governance strategy should be. Small companies find it easier to set clear standards for data handling, due to direct communication. You want to make sure that you solve one problem really well before moving on. Before intending to change, find out what the culture and the strong incentives in your organization are. LLMs as the solution for Data Management? ChatGPT is already very good at classifying information today. It can create required documentation automatically when fed the right parameters. It can supersede keyword search in finding information. This can help to scale Data Governance and Data Management work. Data Management will become more automated, but also much more important going forward. RAG architecture: first build up your own knowledge database, with the help of vectorizing the data into a vector database. The results from querying this database are used by the LLM for interpretation. Find a way to consolidate all your input information into a single pipeline to build your knowledge database. Building strong controls on naming conventions will be less important going forward. Vectorized semantic search will be much faster. Entity matching will become very important. Fact tables and dimension tables will become less important. Data to value: Be able to benchmark your internal performance against the market to understand trends and how they affect you. Using and aggregating third-party data is even harder than using internal data. You need to find ways to combine internal and third-party data to get better insights.…
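The RAG flow described above - vectorize your knowledge, query the vector store, hand the hits to an LLM for interpretation - can be sketched in a few lines. This toy uses bag-of-words counts in place of a real embedding model and an in-memory list in place of a vector database; all names and documents are invented for illustration, and the final LLM interpretation step is omitted.

```python
import math
from collections import Counter

def embed(text):
    """Toy bag-of-words 'embedding'; a real RAG pipeline would use a
    trained embedding model and a vector database instead."""
    return Counter(text.lower().split())

def cosine(a, b):
    """Cosine similarity between two sparse count vectors."""
    dot = sum(a[t] * b[t] for t in a)
    norm = math.sqrt(sum(v * v for v in a.values())) * \
           math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def retrieve(query, documents, k=1):
    """Return the k documents most similar to the query; in RAG these
    would be passed to the LLM as context for interpretation."""
    q = embed(query)
    ranked = sorted(documents, key=lambda d: cosine(q, embed(d)), reverse=True)
    return ranked[:k]

docs = [
    "invoice processing policy for suppliers",
    "employee onboarding checklist",
    "supplier payment terms and invoicing",
]
print(retrieve("how do we pay supplier invoices", docs))
# ['supplier payment terms and invoicing']
```

Note how the toy already hints at Pedram's point about naming conventions: retrieval works on semantic overlap rather than on exact keys, so strict naming controls matter less once search is vector-based.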
 
«Don’t go over to the cloud without truly understanding what you are getting into.» Unlock the secrets of cloud migration with industry expert Jonah Andersson, a senior Azure consultant and Microsoft MVP from Sweden. Learn how to seamlessly transition your data systems to the cloud. Jonah shares her knowledge on cloud infrastructure, AI integration, and the balance between Edge AI and Cloud AI, providing a comprehensive guide to building resilient cloud systems. Explore the intersection of IT consulting, Data Governance, and AI in cloud computing, with a specific focus on security and agile workflows. Understand the critical impact of GDPR on data management and the essential collaboration between IT consultants and data governance experts. Jonah and I delve into the growing trend of edge AI, driven by security and latency concerns, and discuss responsible AI usage, emphasizing security and privacy. Learn how to navigate the complexities of multi-cloud strategies and manage technical debt effectively within your organization. Jonah offers tips on avoiding common migration mistakes and highlights the significance of using tools like Azure's Cloud Adoption Framework. Whether you're modernizing outdated systems, merging companies, or transitioning to a new cloud provider, this episode equips you with the essential knowledge and resources to ensure a successful and strategic cloud migration journey. Join us for a deep dive into the future of cloud computing with an industry leader. Here are my key takeaways: Azure services can be tailored to use cases and service needs, but you need to understand your requirements. Once you understand what you need to do, you need to gain perspective on the how - what methods and processes are supported? Think security at every step. Security in integrations is an important part that we need to focus more on. Bringing different competencies together is a vital ingredient in building resilient applications. 
Cloud is about where your data resides, how you protect it, and how you handle big data. Cloud should support the entire data lifecycle. Cloud and AI: «Cloud computing is the backbone of AI.» AI pushed for Edge AI, in addition to cloud. Reasons for Edge AI are latency, but mainly security. Cloud can provide an attack surface, e.g. for data poisoning, lack of control over training data, etc. AI tools can pose concerns about what data you are exposing and how. Awareness and education are important when building something with AI. You need to at least understand your input to track your output - explainability starts with an understanding of your data sources. There is a risk to Model Governance on-prem, due to the level of competency needed. Multi-Cloud vs. Single Cloud: This is one of the questions to consider at the beginning of a cloud migration. Drivers for a multi-cloud strategy are: avoiding proprietary vendor lock-in; existing applications or infrastructure on another platform; choosing according to the quality of services offered by cloud vendors. If you choose multi-cloud for automated resource management, you need to consider support platforms. Cloud Migration: Reasons for cloud migration often boil down to gaining resiliency in the cloud, due to redundancy. You need to uphold Data Quality not just after the migration but also during transit. Cloud migration requires strategy. There are great resources to help with your cloud migration, like the Cloud Adoption Framework or the Well-Architected Framework. Use observability and orchestration tools for your migration process. Ensure you understand your cost and can optimize it to fit your needs.…
 
"For me, it really goes back to basic human needs, almost." How can the sense of community support Data Professionals? We dive deep into this question with Tiankai Feng, a prominent figure in data governance and the Data Strategy and Data Governance lead at ThoughtWorks Europe. In this season four premiere of MetaDAMA, Tiankai shares his unique journey and how his passion for music plays a pivotal role in his professional and personal life. His story underscores the multidimensional nature of data professionals and the importance of a supportive community. Building and nurturing internal communities is crucial. Tiankai and Winfried discuss how data governance conferences serve as therapeutic spaces, offering more than just professional development—they provide emotional and communal support. We explore various community models like grassroots movements and rotational leadership, highlighting the indispensable role of leadership in fostering these spaces. Recognizing and valuing community leaders is essential for sustaining these supportive networks within organizations. Lastly, we delve into practical strategies for building strong data management communities. From integrating community introductions into onboarding processes to using these groups as recruitment tools, we cover it all. We also examine how company culture shapes the type of communities that flourish and the support provided by external organizations like DAMA. Joining communities helps alleviate isolation, share solutions, and foster a connected environment. Tune in to learn how to make community engagement a cornerstone of data governance and elevate both personal and professional growth. Here are my key takeaways: Communities in organizations Community is needed as a counterpart to the transactional behavior in a workplace. Communities of Practice is an established model, that comes from a technical side, methodology focuses. 
Communities can create new lines of communication that can help spread a sense of belonging in an organization, beyond a specific department or team. Leadership needs to accept that being in a Community is also part of the job. Community leaders need recognition and to be valued for their work. The «smartest person in the room» should not be the leader of a Community - this can turn a community into a lecture setting. Ensure that organizational hierarchies are «flattened» in a Community, to support psychological safety and freedom to speak. Ensure you have some rules of engagement or a code of conduct in place. Breakout groups can be a way to get everyone to participate actively in the Community. Leadership plays an important role in promoting Communities in an organization. Well-functioning Communities of Practice can become a selling point for recruitment. DAMA as a Community: A Community for Data professionals outside their organizations. The most outstanding impact DAMA can have is networking in a broad community, both locally/nationally, but also internationally across sectors. There is an element of mentoring and coaching that a community of this size can offer. Another factor can be talent-sourcing: both for organizations, but also for job-seekers. Upskilling and learning are a great part of the DAMA Community, also including the CDMP certification. You need to find your balance between domain- or sector-specific communities and large data communities like DAMA. Social media (SoMe) as a Community: You need to be conscious about what you are reading on social media. It can be a great place to provoke some new thoughts and get perspective on your work. There is certainly an entertainment factor to using social media. Humor can heal a lot, and laughing about the challenges we face as Data folks is like therapy.…
 
«Hva er mulig å gjøre med disse teknologiene når de blir 10 ganger så bra som de er idag? / What might be possible to do with these technologies when they become 10 times as good as they are today?» Can moonshot innovation really be the key to solving challenges that traditional methods fail to address? Today, we're thrilled to welcome Yngvar Ugland from DNB's New Tech Lab, who will unravel the complexities of digital transformation and share his unique insights from both corporate and startup ecosystems. From breaking the mold of the classic "people, process, technology" framework to stressing the importance of customer-centric approaches, Yngvar’s perspective offers a refreshing and profound look into fostering genuine innovation within established enterprises. Technological innovation isn't always smooth sailing, and Yngvar helps us understand the friction between traditional mindsets and innovative approaches. Balancing high-trust societies against the urgency-driven dynamics of capitalism, we discuss the complex landscape of AI hype and explore technologies like GPT-3 and GPT-4. With an optimistic outlook, Yngvar encourages us to embrace the transformative potential of generative AI, highlighting the unprecedented opportunities that lie ahead. Tune in to gain a deeper understanding of the ever-evolving world of technology and digital transformation. Here are my key takeaways: Yngvar has built and is leading what he calls the «moonshot unit» at DNB. What do we need to do to actually implement and adapt to new technology and ways of working? How do we think tech for people in tech? We can identify three dimensions needed for change: a data/tech component, a business component, and a change component. There is a difference between necessary and sufficient - just because a change is necessary doesn’t mean that the proposed solution is sufficient. You need to find ways to navigate uncertainty and be active beyond concrete hypothesis testing or tech evaluation. 
For organizations to be successful, you need to coordinate maintenance, improvement, and innovation - it’s not one of them, but all three in concert that ensure success over time. Innovation and digital transformation are not a streamlined process. Uncertainty offers a space for opportunity. We use the term agile without grasping its true meaning - an inspect-and-adapt mindset is key to agile. The development from GPT-1 through GPT-2 to GPT-3 is an example of the exponential development of technology. The digital infrastructure in Norway, which can utilize data and technology for value creation across public and private sectors, is a reason for our success. The difference to the US market is that there, it is large corporations that take on societal challenges. How our society is structured has an influence on how we perceive the need for innovation. It is natural to meet resistance in change and innovation. To iterate effectively you really need to live a mindset built around FAIL - First Attempt In Learning. We overestimate the effect of technology in the short term and significantly underestimate it in the long term.…
 
«We can get lost in politics, when what we should be discussing is policy.» In this season’s final episode, we’re thrilled to have Ingrid Aukrust Rones, a policy expert with a rich background in the European Commission and Nordheim Digital, shed light on the role of the global geopolitical landscape in shaping digital policies. Explore with us the dominant influence of big tech from the US to China, and how the EU's regulatory approach aims to harmonize its single market while safeguarding privacy and democracy. Ingrid breaks down the contrasting digital policies of these regions and discusses how the EU's legislative actions are often driven by member states' initiatives to ensure market cohesion. We also chart the historical shifts in digital policy and market regulations from the 1980s to the present, highlighting key moments like China's WTO entry and the introduction of GDPR. Lastly, we delve into the future landscape of digital societies and the challenges nation-states face within the context of Web3. Ingrid emphasizes the concentration of power in big tech and its potential threat to democracy, while also lauding the EU’s robust regulatory measures like the Digital Markets Act and the Digital Services Act. Here are my key takeaways: Geopolitics: Our security, our economy, and the national and international system rely on data. How data is collected, stored, protected, used, transferred, and retained happens as much across borders as within them. Data Strategy at this geopolitical level is about creating digital autonomy - not being reliant on big international enterprises, so that our political system can stay sovereign. The US is based on a liberal, free-market model that is very innovation-friendly. China is based on a very controlled environment, with limited access to its domestic market: incubation of local companies, shielded from global competition. The EU is setting the regulatory standard. Freedom is balanced with other values, like fairness or democracy. 
We need to talk about the role that big tech plays on the global scene, and the geopolitical impact on digital policies. Ingrid has a role between policy and business, coordinating and finding opportunities between both. The EU has set the global standard in how we could deal with data and AI from a regulatory perspective. Politics are the decisions we make to set the direction for society. «Policy is the plan and implementation of what is decided through politics.» Cultural differences influence how we perceive, utilize, and establish global policies, but also how we work with data in a global market. We have an issue if we only think in 4-5 year election cycles when tackling long-term issues. Regulation is the biggest tool the EU has. «We are always in competition with technology, because technology develops so fast, and legislation develops so slowly.» You can see a change in responsibility for enforcement of EU rules and regulations, where implementation is moved from national responsibility to EU responsibility. The EU system is not an easy system to understand from the outside. The rise of Big Tech: We can go back to the anti-trust laws of the 1980s, which opened the door to much more monopolistic behavior. The rise of the internet had a large influence on big tech. The liability shield was a prerequisite for social media platforms to gain traction. Big tech has created dependency for other organizations due to e.g. their infrastructure offerings. We need to be aware of that concentration of power in the market. Big Tech is not just leading but also regulating the development of the market. Bigger companies that compete with Big Tech feel their influence and size the most.…
 