Category Archives: Technology

What can I.T. learn from GDPR?

So, GDPR has been in effect for just over a year now (since 25 May 2018). Was there a celebration in your organization, or is GDPR a bad word? 🙂

Many lessons have been learned through the effort of bringing our systems into compliance with GDPR.

I think we can start to generalize from those lessons to identify how our own I.T. systems might benefit from this effort.

Generally, I think there are four broad steps to bring your systems in line with GDPR – and to lead to better compliance overall.

  1. Understand what data you’re holding and where it is stored.
  2. Collect and organize your data so it becomes an asset.
    • If the data is managed and becomes an asset, you can better determine the effort and value – or the expense – it takes to hold it.
    • Bringing it in from extraneous storage locations is important. If you don’t move the data, cataloguing it can serve the same purpose. You must know where it is stored so that it doesn’t become a forgotten silo with unmanaged and broken permissions and access controls to surprise you later.
  3. Determine which services related to compliance with the regulation, or to management of the I.T. system, you can and should centralize.
    • Which monitoring systems do you need to identify changes, updates, access requests, delete requests, and other operations requested on the data?
    • Which services are needed to return current status and transactional logs related to the data and these requests?
    • Which trigger and alert services do you need to notify you of items that are trending in unexpected directions or are out of compliance?
  4. Understand which applications are accessing that data, and prepare a plan to replace them with compliant applications, eliminate those that will be too expensive or risky to maintain, or modify them so they can be compliant.
    • Integrating your corporate applications with the centralized monitoring and compliance system can provide you with a single source of application services related to the regulation, and flexible applications that can more easily be modified to support future regulations. (e.g., do you do business in the state of California?)
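The inventory in steps 1 and 2 can start as something very simple: one structured catalog record per data store. Here is a minimal sketch in Python; the field names are my own illustration, not any GDPR-mandated schema:

```python
# A minimal data-inventory record supporting the steps above.
# Field names are illustrative assumptions, not part of any standard.
from dataclasses import dataclass, field

@dataclass
class DataAsset:
    name: str                     # step 1: what data you hold
    location: str                 # step 1: where it is stored
    owner: str                    # step 2: who manages it as an asset
    contains_personal_data: bool  # drives the compliance monitoring in step 3
    accessed_by: list = field(default_factory=list)  # step 4: applications

def find_unowned(assets):
    """Flag assets with no owner - the 'forgotten silos' that surprise you later."""
    return [a.name for a in assets if not a.owner]

catalog = [
    DataAsset("crm-contacts", "sql-prod-01", "sales-ops", True, ["CRM"]),
    DataAsset("old-exports", "\\\\fileshare\\temp", "", True, []),
]
print(find_unowned(catalog))  # → ['old-exports']
```

Even a list this small makes the prioritization conversation concrete: the unowned, personal-data-bearing stores go to the top of the queue.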

This can take some time. It can take some patience, and it will require a prioritization effort to determine which project to modify first, etc. Perhaps there is someone in your organization who remembers the Y2K effort? Ask them about their lessons learned from that prioritization effort. 😊 I’m not suggesting that the implied urgency is the same…  or am I? We still have time before GDPR affects our own little I.T. group, right? Well, maybe not, and perhaps there is some urgency here.

Discussing this path to compliance can be useful. A journey begins with taking the first step, of course, but having a map brings so much efficiency to the effort and saves so many wrong turns.

Use this map to compliance and see if you can liken it to your journey to improved collaboration, or deploying MFA and secure authentication across your organization, or any other I.T. challenge that is in front of your team this quarter.

Where are your data silos hiding?

Chredge is here

I just tried Microsoft Edge built on Chrome and, so far, it is marvelous.

I listened to a Windows Weekly episode this morning that was a week old.

These orders are seven bloody hours old!

It was Windows Weekly episode 614. I fully enjoyed the first half (then I arrived at the office), and it was mostly about Chredge. (“It was so choice. If you have the means, I highly recommend it.”)

Tonight, I went to the Microsoft Edge Insider site, downloaded the test build of Edge built on Chromium, or “Chredge”, and installed it on my Windows 10 box.

It installed flawlessly, imported my Chrome settings for my primary Chrome account, and allowed me to create user profiles, as I’ve long been doing in Chrome. (Cue the Hallelujah Chorus!)

My constant pleadings to the browser buddhas have been answered. User profiles in Chredge.

I think that – for a moment – I felt just as Riley must have felt when he observed “Stairs” in ‘National Treasure’.

Please download and enjoy the goodness. I’m going back now to test extensions. (Mary Jo told me that they are working!) (OK, so she didn’t tell only me, but she mentioned that in the Windows Weekly podcast episode I linked to above.)

The Activity Store is the New Source of Truth

The search for a document management solution wasn’t really being undertaken by the right people until you were asked whether it could either

  1. Serve as the Source of Truth, or
  2. Respect another repository as the Source of Truth

Life used to be so simple back then. At least, that was how it seemed. It turned out that even the records management solution was never The Source of Truth – it was only ONE Source of Truth within the organization.

We will forever have to manage many Sources of Truth within our information farms and fields and domains.

It was always easier, though, when we could roll up multiple systems or at least similar systems, into a single Source of Truth.

The lingua franca of our information domains has changed, though. It’s no longer documents, or fragments, or reusable content. It is not even the lowest common denominator, the log file, any longer. It is now “Activity”. If that sounds like a vague term, it is only vague until you define it. Create a schema for it, and then every system can feed and consume “Activity” using JSON or another format (I’ll bet you still have plenty of systems/applications using REST or JavaScript APIs or even .CSV files, don’t you?), and your systems can start to build on each other again.
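As a sketch, here is one minimal “Activity” schema that any system could emit and consume as JSON. The field names are my own illustration, not a Microsoft or industry standard; the point is that every system agrees on a shared shape:

```python
# A minimal "Activity" record that any system can emit and consume as JSON.
# Field names are illustrative; define whatever schema fits your domain.
import json
from dataclasses import dataclass, asdict
from datetime import datetime, timezone

@dataclass
class Activity:
    actor: str      # who performed the action
    verb: str       # what they did: "created", "updated", "deleted", ...
    object_id: str  # what the action was performed on
    timestamp: str  # ISO 8601, so every consumer parses it the same way
    source: str     # which system emitted this activity

a = Activity(
    actor="jsmith",
    verb="updated",
    object_id="doc-4711",
    timestamp=datetime.now(timezone.utc).isoformat(),
    source="sharepoint",
)
print(json.dumps(asdict(a)))  # a record any downstream system can consume
```

Once that contract exists, "feeding the Activity Store" is just serializing one of these records, and "consuming" it is parsing the same shape, no matter which system is on either end.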

Microsoft 365, primarily through the use of the O365 Audit Log and the Microsoft Graph, is providing an Activity Store that will power the next generation of applications.

Microsoft Azure and Amazon Web Services provide solid tools for managing and tracking the IaaS portion of your data center. But there is no comparison to the Audit Log and the Microsoft Graph for making sense of what is happening within your Digital Workplace.

As you build applications, you should be dropping hints, if not writing explicit updates, into the Log and the Graph. This is not only for ISVs, but also for the new breed of custom developer, or even power users.  You can choose to make your app yet another Source of Truth, but if you want your data and function to be incorporated into one of the new style of Digital Workplaces, then you may be better served by leveraging a central Activity Store.

The Activity Store is the new Source of Truth. Are you leveraging it yet? If not, your competitor might be.  There will always be many Sources of Truth. But buyers will be asking if you are yet another, or if you are able to leverage the one that they have.

Thoughts on SharePoint Conference 2014

My thoughts on the Microsoft SharePoint Conference 2014.  The event was a great success, held Mar 3-6, 2014, at The Venetian, in Las Vegas.  Compared to previous SharePoint Conferences, SPC14 was held at a bigger venue that was easier to get around in, the exhibition hall was more wide open, the keynote speaker was about as big a deal as could have been had (Would Hillary have been a bigger coup?), and The.WIFI.Worked.  A tremendously successful conference.  The SharePoint and Office 365 teams at Microsoft deserve to take the next 36 hours off, perhaps attend the Las Vegas NASCAR race, and then bask in the success of their conference.

The new product announcements show some good new direction. I’m particularly excited about

  • the new Office 365 APIs that bring Office 365 closer to parity with SharePoint Server 2013 as a platform for business applications,
  • the new content enablement provided for PowerPoint, Excel, and Outlook applications,
  • improvements to Power BI across the board and the release of Power Map (even though this was announced a month ago, I’m adding it here),
  • the concept of Working Like a Network – it will take a while for this to roll out, but the application ideas are already starting to roll around this one,
  • and the biggest one, for me and my new company, BluLink Solutions: the patterns for migrating Trusted Code solutions to the App Model.

SPC14 was the 5 year reunion for those attendees of the tremendous and inimitable SPC09 conference, and it was the 10 year anniversary of the launch of SharePoint 2003, which started the enterprise-wide push and where “SharePoint” started to find its legs, as it grew into MOSS 2007 and SharePoint Server 2010.

One of the things that makes for a great reunion is a strong community.  I consider myself lucky to have been able to observe the growth of the community, and the depth and breadth that it contains now is fantastic.  No longer can any single group, or collection of groups, control, manage, or “provide direction” to the community.  There are many groups within the community, and the overall group is large enough to support new groups as needed.   I attended the meetings of the MSFT Technical Communities, the SharePoint Saturday leaders, and the SharePoint User Group leaders, and some very useful coalitions are building helpful tools to support different locales and groups of different sizes. I heard (secondhand, as I wasn’t on the list) that the Women in Technology group overflowed its planned meeting space, and that is tremendous. Impromptu activities such as the funeral procession for InfoPath, small groups such as SP FitBitters and SPRunners, gatherings around evening events or side trips to local attractions, and countless others are great examples of how the community is diverse enough to take care of its own.  This is so encouraging.

On the other hand, it does mark the maturation of the SharePoint community, and marks a time when it will be more and more difficult to ensure messages are delivered and received accurately.  The good old days of one person being able to understand all of SharePoint are gone. One person can now understand most of SharePoint, and can track most of what is going on through diligent following of multiple blogs, RSS, and Twitter feeds, but that can be difficult to maintain when we also have to work…

The three early social kings of SharePoint have changed, as well.  Mark Miller (@EUSP), it seems, has moved to greener pastures; Joel Oleson (@Joeloleson) will always continue to drive a large group of followers, as illustrated by his leading a procession in a monk’s robe through the conference; and Jeremy Thake (@jthake) has now moved to join Microsoft in leading developers to new depths. No longer are the three of them driving the community audiences (Devs, IT Pros, End Users) in a coordinated broad direction.  True, the trio has been divided for a little while now, but I think SPC14 marks the official passing of the torch back to the community as an entity.  Even at SPC12, the community booth efforts were spearheaded by a group with these three providing much of the guidance.

Moving forward, though, I think that SharePoint as a whole is too large for a single group of friends and workers and associates to be considered the leaders.  Each of us has our own path to carve out of the world of business solutions. The relay baton has been passed. SharePoint has grown up.

Where will you, as an attendee of SPC14, shine your light?  Whatever you are working on, share it. When you come up with a best practice, or a new approach to using OOB features combined in a unique manner to provide new functionality, let others know.  As you see your companies using SharePoint as a platform for new vertical applications and to support solid business processes that have been rebuilt to mash up data in a new way and expose it to new business groups who couldn’t access it before, share what you see! Talk about the impacts, and help other groups realize the potential locked within their SP OOB mentality.

SharePoint Friends Don’t Let SharePoint Friends Work Only with OOB Functionality.

I had a great time at SPC14 and I hope that all of you did, as well.  If you didn’t, let MSFT know. If you did, let the community know!    I can’t wait to see everyone again next time.

SharePint at WPC12

One of my favorite weeks of the year is coming up – the Microsoft Worldwide Partner Conference.  One of the best meet-ups of the week has always been the SharePint event.  This year should be no exception.


This year, the Microsoft SharePoint Marketing Group has worked with Pingar and 3 other software companies, Axceler, Rackspace, and Idera, to host a meet-up for partners that work within the SharePoint ecosystem during the week of WPC12.

You know what they say…  SharePoint by Day, SharePINT by Night!

This year SharePint will be on Tuesday, July 10, from 6-8PM at the Madison Avenue Pub, in Toronto.

WPC is a huge event, and while there are some important sessions for SharePoint partners, the real significant effort at WPC should be about meeting with other partners and working to grow your company’s network and connections.  I think this is why the WPC Connect portion of WPC has grown to be (at certain times of the week) the busiest part of the conference.  While it can be hard to find open time to meet with specific partners, at least SharePoint partners understand where they can meet their SharePoint peers and enjoy some good conversation.

If you haven’t already registered for WPC12, please do so at

I’ll be meeting with partners at WPC Connect, attending a couple of the sessions, and hoping to meet everyone at SharePint!  If I haven’t already reached out to meet you, please reach out to me and let’s meet at WPC12!

I’m certainly looking forward to an amazing week in Toronto.


SQL Server 2012 switching to Core-Based Licensing

I found this news item from Directions on Microsoft, “SQL Server 2012 Adopts Per-Core Licensing Model”, interesting.

SQL Server 2012 now requires processor core-based licensing for the Enterprise edition, and core-based licensing is one of two types of licensing available for the Standard edition.

For about 6-7 years now, ever since Oracle started charging per processor core, Microsoft enjoyed an easier licensing conversation because they licensed per processor, not per core. I used to sell Microsoft technology, often had to answer questions about how their products were licensed, and was glad that Microsoft was only charging per processor. It felt, at the time, that Microsoft wasn’t trying to penalize people for using the latest and greatest CPUs (which then were arriving with 2 cores, or 4 – of course, now, there are many more cores).

How times change. Apparently, Microsoft isn’t concerned about competitive licensing scenarios with Oracle any longer. I think that is probably a good thing for Microsoft. It also suggests that Microsoft’s internal models identify that they have been leaving customer money on the table, and that moving to a per-core license will extract a little bit more from customers than the per-server licensing model did. All’s fair in product licensing?

Redmond.mag quotes Wes Miller, an analyst with Directions on Microsoft, as saying that a single Enterprise core for SQL Server 2012 will have a list price of $6,874 per core. These are only sold in two-core packs. A server can be partially licensed or fully licensed. A fully licensed server requires a minimum purchase of 4 cores. Of course, volume licensing customers and customers with an Enterprise Agreement and Software Assurance will have significant discounts off of the list price.

I do like the flexibility of the licensing model in allowing customers to move licensed cores from on-premises to hosted cloud providers and back again.

I was pleased to see the analysis on

The actual cost for EE is roughly the same as if you licensed 2 sockets of SQL 2008 R2 Enterprise Edition as long as it had 4 cores per CPU. The cost goes up as soon as you start using 6 core processors and above. The prevalence of 4 core processors means this likely won’t change much for many organizations.

Compared to SQL 2008 R2 Datacenter, however, there is a large cost difference. Datacenter costs $54,990 per processor or over $100,000 to license a 2 CPU system. You can now essentially get the benefits of Datacenter Edition (unlimited virtualization rights, etc.) for half the cost you would pay in SQL 2008 R2.

Even with this new licensing model there are still huge cost savings to be had by licensing all cores of a server and virtualizing your SQL 2012 workloads. It’s hard to argue with unlimited virtualization rights especially for those lightly loaded SQL workloads.
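The arithmetic behind those comparisons is easy to check with the list prices quoted above (before any volume-licensing discounts):

```python
# Worked list-price comparison using the figures quoted above.
CORE_PRICE_EE = 6_874        # SQL 2012 Enterprise, per core (sold in 2-core packs)
DATACENTER_PER_CPU = 54_990  # SQL 2008 R2 Datacenter, per processor

def ee_license_cost(total_cores):
    """Per-core cost with a 4-core minimum per server, rounded up to 2-core packs."""
    cores = max(total_cores, 4)
    if cores % 2:  # round up to a whole 2-core pack
        cores += 1
    return cores * CORE_PRICE_EE

# A 2-CPU server with 4 cores per CPU (8 cores total):
print(ee_license_cost(8))      # 54,992 for SQL 2012 Enterprise
print(2 * DATACENTER_PER_CPU)  # 109,980 for 2008 R2 Datacenter on the same box
```

That is the "half the cost" comparison in the quote: licensing all 8 cores of that server under SQL 2012 Enterprise runs $54,992 against $109,980 for 2008 R2 Datacenter, and moving to 6-core or larger processors is where the per-core model starts to cost more.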

I wonder if this model will also carry over to the upcoming next version of SharePoint Server licensing that will be out sometime this year or early next year. My guess is that this will also apply to the next version of SharePoint Server. (I have no insider knowledge of this; it is just a guess.)

Directions on Microsoft report

Microsoft SQL Server 2012 licensing page

Redmond.Mag: SQL Server 2012 to Bring Some Price Hikes

New SQL 2012 Licensing and its Impact on Virtualization

CloudShare for Demo Environments

I’d like to tell you a story.  One of the software development companies that I’m working with, Ooyala, is building an integration with SharePoint.  In April (2011), we were planning to have the public unveiling of the product at the NABSHOW in Las Vegas.  It is a huge convention, and we had secured a demo kiosk and were planning to get enough of the product developed so that we could illustrate how the final product would work. 

One of our development teams was in Argentina, another development team was in Redmond, WA, and Ooyala has headquarters in Mountain View, CA.  We needed a development and testing environment that would enable all three groups to work together and to share an environment in preparation for the show.

For a couple of months, we had been using a development environment provided by CloudShare, and everything had been working well.  We would bring up the CloudShare environment when we needed to, and share access to it from city to city.

One of the useful features that CloudShare provides is the ability to recreate a SharePoint environment from a list of starting environments that they provide.  When one of our development teams releases a new build, we can recreate a new environment on CloudShare and test the installation and configuration, etc.  It was working very well.

If you haven’t tried out CloudShare yet, you can find them at  Look for the “SharePoint in the Cloud” selection right on the front page.  That page will describe the SharePoint environments that they offer and how you can utilize them.

As we got closer to the NABSHOW, however, something wasn’t quite right.  I was worried about little things, like the URL used for the demo.  We also wanted to share the demonstration version with other potential partners and customers, so we wanted to create a demo URL (i.e. NABDEMO.OOYALA.COM) and have the dev/test environment support that.  I couldn’t find the right way to make that happen for the CloudShare environment that wasn’t located within our data center, but a quick call to CloudShare answered the question.  (They have since cleaned up how the features are displayed so that this particular feature is easier to find.)

The problem that I was running into stems from one of the features of CloudShare that allows their service to be affordable.  When a CloudShare environment times out due to a period of inactivity, the environment is automatically suspended, and the assigned IP addresses and physical hardware can be re-utilized for other users and to meet additional demand.  Sounds very cloud-centric, right?  Well, yes, actually.  However, when the environment was re-activated the next time someone needed it, the server machines in the environment might be assigned a different collection of IP addresses.  This meant that I might have to keep updating the DNS entries for the URL that I wanted to use for the demo.  Difficult and inconvenient.
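If you find yourself in a similar spot, a small script can catch a stale record before a demo does. This is a generic sketch; the hostname and IP below are hypothetical examples, and nothing here is a CloudShare API:

```python
# Detect when a demo URL's DNS record has gone stale because the
# environment came back up with a different IP address.
# The hostname and expected IP below are hypothetical examples.
import socket

def dns_is_stale(hostname, expected_ip):
    """Return True if the hostname no longer resolves to the IP we configured."""
    try:
        current_ip = socket.gethostbyname(hostname)
    except socket.gaierror:
        return True  # the name doesn't resolve at all
    return current_ip != expected_ip

# Example (hypothetical): warn if the demo record needs updating.
if dns_is_stale("nabdemo.example.com", "203.0.113.10"):
    print("DNS record is stale - update it before the demo!")
```

Run on a schedule, a check like this turns "the demo URL quietly broke overnight" into an alert you see before anyone else does.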

But then I learned about a great little feature called “Always-On”.  This costs more per month, but enables your environment to stay up all the time, maintaining the same assigned IP addresses.  A quick call to CloudShare operations, and we had the solution: a constant IP address for the DNS settings, and we were starting to demo using an easy-to-communicate, easy-to-use URL.  I’ve learned, over the years, never to underestimate the importance of having an easy-to-communicate URL when working with salespeople…

So, long story short, we unveiled the product at the NABSHOW in April – the demo environment was flawless, and we are still using CloudShare as our development environment as we enter into a full Beta stage with the product here in June.

If you’d like to learn more about CloudShare, please learn more at

If you’d like to learn how to bring the video capabilities of Ooyala into your SharePoint environment, please send email to

If you’d like to take me to a Seattle Mariners game this summer (Yes, I’m back on the bandwagon!), please send me email –