• Maximizing Productivity with GitHub Copilot Training

    Having worked in technology for quite a while, most of that time as a software developer, I can confirm that you get to know the “trade secrets” of the industry and how they affect productivity and bottom-line results.

    A significant challenge we frequently encounter is the disconnect between expenditure on new technology and the lack of investment in related staff training. This stems partly from the belief that technologists are responsible for their own training, but it’s also a systemic issue: procurement of software and training are often treated as unrelated, independent transactions.

    Vendors, including Microsoft, have addressed this by creating coursework that’s online and freely accessible, in addition to their traditional offerings. By offering training free of charge, they can inform clients of its availability and ensure their products are used by people trained on the technology, which goes a long way toward increasing client satisfaction.

    With the emergence of AI-assisted software development, however, failing to train staff on these tools has an outsized impact: you may never realize the full extent of the potential productivity and labor savings, and you may greatly underestimate the tools’ value.

    Conversely, well-trained staff will apply these readily available tools effectively, fundamentally transforming the software development process and delivering clear productivity increases and labor savings.

    Microsoft offers extensive, free training and certification on GitHub Copilot, a key developer productivity assistant. The labor savings and productivity gains you realize correlate directly with how many of your staff are trained on the tool.

    Microsoft Learn Portal (Copilot): https://learn.microsoft.com/en-us/training/browse/?terms=GitHub%20copilot

    Microsoft GitHub Copilot Certification: https://learn.microsoft.com/en-us/credentials/certifications/github-copilot/?practice-assessment-type=certification

    GitHub Copilot Certification: https://learn.github.com/certification/COPILOT

    GitHub Video Tutorials: https://github.com/features/copilot/tutorials

    GitHub Learn Portal (Copilot): https://learn.github.com/learning?product=GitHub+Copilot

    LinkedIn Learning: https://www.linkedin.com/learning/topics/copilot-22672489

    Author: Jim Fahrenbach

  • Beyond the Buzzwords: The Real Pre-Work for Cloud AI and Analytics in 2025

    In the world of enterprise tech, buzzwords are designed to short-circuit complexity. In 2025, terms like “Unified Data Platform,” “Generative AI-Ready,” and “Data Fabric” promise a seamless path to success. They paint a picture of an integrated, intelligent future, just a simple installation away.

    While today’s cloud ecosystems are more powerful than ever, that promise often overlooks the foundational work required for success. The most common pain points in any cloud analytics or AI initiative have less to do with the new technology and more to do with the existing environment it’s connecting to.

    Think of it as building a state-of-the-art smart home. You can’t install the AI assistants, automated lighting, and security systems until you’ve ensured the lot is graded, the foundation is solid, the utility lines are in place, and the permits are approved.

    Before you dive into the next big platform, here are the critical “pre-work” areas to investigate.

    1. Is Your Data & Infrastructure Truly Ready?

    The focus has shifted from basic compatibility to holistic readiness. Before you can leverage AI, you must assess the entire data pipeline that will feed it. “Garbage in, garbage out” has never been more true.

    • Hybrid and Multi-Cloud Connectivity: It’s no longer just about connecting to Azure or AWS. How will your on-premises data centers, private clouds, and other SaaS platforms securely and efficiently interact with your new analytics environment? Can your existing network plumbing and VPNs handle the sustained, high-volume traffic required for model training and large-scale queries?
    • Data Quality and Lineage: Where is your data coming from? Is it clean, validated, and trustworthy? You must have a clear understanding of data lineage to ensure your AI models and analytics dashboards are built on a foundation of truth.
    • Vendor and Firmware Audits: Though less of a blocker than in the past, this step is still crucial. Check with your network and hardware vendors to ensure your core infrastructure isn’t running on unsupported legacy firmware that could create security or performance bottlenecks.
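
    The data-quality checks above can be sketched as a minimal validation gate that rejects a batch before it ever feeds an AI pipeline. This is illustrative only: the field names (customer_id, revenue) and the 5% null threshold are hypothetical, and a production pipeline would more likely use a validation framework such as Great Expectations or dbt tests.

```python
# Minimal data-quality gate: reject a batch of records before it reaches
# model training or a dashboard. Field names and thresholds are hypothetical.

def validate_batch(rows, required_fields=("customer_id", "revenue"),
                   max_null_ratio=0.05):
    """Return (ok, issues) for a list of dict records."""
    issues = []
    if not rows:
        return False, ["empty batch"]
    for field in required_fields:
        nulls = sum(1 for r in rows if r.get(field) in (None, ""))
        ratio = nulls / len(rows)
        if ratio > max_null_ratio:
            issues.append(f"{field}: {ratio:.0%} missing exceeds {max_null_ratio:.0%}")
    # Duplicate keys often signal an upstream lineage problem.
    ids = [r.get("customer_id") for r in rows]
    if len(ids) != len(set(ids)):
        issues.append("duplicate customer_id values found")
    return (not issues), issues

batch = [
    {"customer_id": 1, "revenue": 120.0},
    {"customer_id": 2, "revenue": None},
    {"customer_id": 2, "revenue": 85.5},  # duplicate id: lineage red flag
]
ok, issues = validate_batch(batch)
print(ok, issues)
```

    The point of a gate like this is “garbage in, garbage out”: it is far cheaper to stop a bad batch at ingestion than to debug a model or dashboard built on it.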

    2. Have You Aligned the Human Layer?

    Technology projects are people projects in disguise. In a complex organization, multiple teams, vendors, and departments have domain over the various pieces you’ll need. Ignoring this is a recipe for delay.

    • Identify All Stakeholders: Go beyond the IT department. Map out every team involved: the on-prem infrastructure team, the cloud networking group, third-party managed service providers (MSPs), departmental data owners, and cybersecurity. Understand their individual timelines, priorities, and SLAs from day one.
    • Assess the Skills Gap: Do you have the right talent in-house? The skills needed for 2025 go beyond SQL. Do your teams understand FinOps, MLOps, prompt engineering, and the specific architecture of platforms like Microsoft Fabric or Databricks? Plan for training and upskilling early.

    3. Have You Mastered Cloud FinOps and Governance?

    Cost management in the cloud has matured into a formal discipline known as FinOps (Cloud Financial Management). It’s about creating financial accountability for your cloud usage, and it’s non-negotiable for any at-scale deployment.

    • Understand the New Pricing Models: The billing landscape is more complex than ever. Are you paying per hour, per query, per user, or based on consumption (like tokens for a Generative AI model)? Your solution will likely involve a mix of these. Work with procurement upfront to ensure your vendor agreements cover these new services and that you aren’t exposed to runaway costs.
    • Empower Accounts Payable: Your AP team is likely accustomed to fixed, predictable monthly software bills. Introduce them to the concept of metered, variable billing early. Provide them with forecasts and explain why costs may fluctuate month-to-month to avoid friction later.
    • Implement Cost-Saving Measures: Are you leveraging cost-saving features like reserved instances for predictable workloads or auto-scaling to shut down resources when not in use? Building a cost-aware culture and automating these practices is key to maximizing your cloud ROI.
    • Establish Robust Data Governance: Who has access to what data, and why? How will you manage access control as data flows into these new, powerful AI-driven platforms? Define your security policies, data masking rules, and compliance frameworks (e.g., GDPR, CCPA) before you migrate sensitive information.
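
    The runaway-cost concern above can be reduced to a simple month-over-month spend check. This is a sketch under stated assumptions, not a finished tool: the service names and dollar figures are hypothetical, and in practice you would pull these numbers from your provider’s cost-management API rather than hard-code them.

```python
# Flag services whose month-over-month spend grows beyond a tolerance,
# so FinOps and Accounts Payable see anomalies before the invoice does.
# Service names and dollar figures below are hypothetical.

def flag_cost_spikes(last_month, this_month, threshold=0.30):
    """Return {service: growth} for services that grew more than `threshold`."""
    spikes = {}
    for service, cost in this_month.items():
        previous = last_month.get(service, 0.0)
        if previous == 0.0:
            spikes[service] = float("inf")  # brand-new spend always warrants a look
        elif (cost - previous) / previous > threshold:
            spikes[service] = (cost - previous) / previous
    return spikes

last_month = {"warehouse-queries": 4200.0, "genai-tokens": 900.0}
this_month = {"warehouse-queries": 4400.0, "genai-tokens": 2100.0, "vector-db": 350.0}
print(flag_cost_spikes(last_month, this_month))
```

    Even a crude check like this gives AP a forecastable baseline: steady services stay quiet, while new or token-metered services surface immediately for review.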

    The promise of AI-driven analytics is real, but it’s not magic. Its success is built on the diligent, often unglamorous, work of preparation. Don’t let the latest buzzwords short-circuit your approach. By focusing on these basics, you move beyond the hype and build a data strategy that delivers lasting value.

    Author: Jim Fahrenbach

  • The Rise of Microsoft Fabric in Data Platforms

    A recent discussion on the history of data platforms revealed a powerful pattern: the corporate data landscape has been shaped by two great waves of commoditization, and the second is cresting right now.

    The First Wave crashed over analytics in the 2000s. Specialized BI tools like Tableau and MicroStrategy offered powerful, yet expensive, insights. Microsoft then disrupted the market with Power BI, leveraging its scale to make sophisticated dashboards a low-cost commodity.

    With analytics solved, the bottleneck shifted to complex data engineering and storage. This created the Second Wave, giving rise to specialized platforms like Snowflake and Databricks. Today, history is repeating itself. Microsoft Fabric is accelerating this second wave by unifying the entire data estate—from storage and engineering to BI and AI—into a single, low-code platform.

    The market’s hunger for this integrated, cost-effective approach is undeniable, evidenced by Fabric’s rapid adoption by 25,000 companies, including 67% of the Fortune 500. By commoditizing the underlying data infrastructure, Fabric allows organizations to pivot from manual data wrangling to the next true differentiator: integrating AI across the enterprise. It’s a remarkable evolution, especially considering that two decades ago, a major technical win was just centering a P&L table correctly on a webpage.

    #MicrosoftFabric #DataStrategy #AI #fabric #data

    https://www.microsoft.com/en-us/microsoft-fabric

    Author: Jim Fahrenbach