Could LinkedIn Be a Global Corporate LDAP?

Who is this article for?

Technologists, investors, and business professionals who see opportunity in near field communication (NFC) will find interest in this article.  The primary purpose of this article is to discuss the possibility that some social media services could replace corporate LDAP servers as NFC sees widespread adoption.

LDAP – A primer

Quickly: if you’re tech savvy, skip this section.  If you’re just looking for a base explanation of what LDAP is, then go here.

If you want a deeper dive then go here.

In most corporations, when we talk about LDAP we’re talking about the storage system that holds your user account, password, and credentials, and that allows you to access the myriad of applications you use every day.  LDAP serves other functions in identifying network assets, but for our purposes in this article we’re primarily concerned with how it is used for authentication and identification.
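To make those two roles concrete, here is a toy in-memory sketch in Python of the two operations most applications use LDAP for: a bind (authentication) and a search (identification, i.e. looking up attributes).  This is not a real LDAP client, and the directory entries below are invented for illustration.

```python
# Toy sketch of the two LDAP operations most apps rely on:
# bind (authenticate) and search (look up attributes).
# The directory contents are invented for illustration.

DIRECTORY = {
    "uid=jdoe,ou=people,dc=example,dc=com": {
        "password": "s3cret",          # a real LDAP server stores a hashed value
        "cn": "Jane Doe",
        "mail": "jdoe@example.com",
        "memberOf": ["app-users", "building-42"],
    },
}

def bind(dn: str, password: str) -> bool:
    """Authenticate: does this DN exist with this password?"""
    entry = DIRECTORY.get(dn)
    return entry is not None and entry["password"] == password

def search(dn: str, attribute: str):
    """Identify: fetch an attribute for a known DN."""
    entry = DIRECTORY.get(dn)
    return entry.get(attribute) if entry else None

jane = "uid=jdoe,ou=people,dc=example,dc=com"
assert bind(jane, "s3cret")        # login succeeds
assert not bind(jane, "wrong")     # bad password rejected
print(search(jane, "mail"))
```

Every application that trusts the same directory gets the same answer to “who is this, and what are they allowed to see?”, which is exactly the property the rest of this article cares about.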

LinkedIn as an LDAP Server

Take a look at your LinkedIn account for a moment.  What is it?  It’s a place to network.  It’s a place to find a job.  It’s also a place to promote yourself and a place for you to follow your professional interests.

But beneath all this is something more fundamental: your identity.  Your professional identity to be sure…but for most of us that pretty much sums up about 80% of our life.

Uniquely, among the social media darlings, LinkedIn has a bias towards professionalism.  It targets industry professionals in a way that Facebook and RenRen don’t.  One way to see LinkedIn:  a global corporate LDAP server.

Sound far-fetched?

It shouldn’t.  Check out the number of sites on the web that now allow you to log in with your LinkedIn account.  LinkedIn has groups with administrators who can assign access and grant permissions to members.  LinkedIn has a ‘Contacts’ section, essentially an address book.  It also has a way to send messages to those contacts, essentially mimicking email, which is traditionally one of the applications that use an LDAP server.  So what prevents LinkedIn from being a corporation’s LDAP server?

The Catalysts

Another way of asking the question:  what technologies would make it possible?

Visualize this:

You go to your new job sometime in 2013 and visit the Human Resources department to get all your paperwork squared away.  Good thing you brought your mobile device.  Your LinkedIn app has all the key information the corporation needs to get you set up.  You tap the NFC reader in front of you, type your PIN code, and the HR rep sees you added to the XYZ corporation’s domain on the LinkedIn website.  She then gives you the necessary group permissions to access your applications, computer, and building location.

When you get to your building, you tap the NFC reader near the door and it verifies with LinkedIn that you’re in the right group to access this building.

When you get to your desk, you tap the NFC reader on your laptop to log in.  It uses your LinkedIn app and the unique key associated with you to check with LinkedIn to determine who you are, what permissions you have, and whether you have access to this computer.  You’re logged in, and your data and applications are ready to use.  Best of all, Google Chrome uses your key and passes it to sites that are LinkedIn-enabled.  It only passes safe data (your first and last name) unless you authorize more sensitive data to pass.
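Every access check in this scenario reduces to the same lookup: map the device’s key to an identity, then test that identity’s group membership against the group the resource requires.  A minimal sketch of that check, with every key, user, and resource name invented for illustration:

```python
# Hypothetical sketch of the NFC access check described above:
# tap -> device key -> identity -> group membership -> allow/deny.
# All keys, users, and resources are invented.

USERS = {
    "nfc-key-0xA1": {
        "name": "Jane Doe",
        "groups": {"xyz-employees", "bldg-7", "laptop-jd01"},
    },
}

# Each resource names the single group it requires.
RESOURCES = {
    "bldg-7-door": "bldg-7",
    "laptop-jd01": "laptop-jd01",
}

def tap(device_key: str, resource: str) -> bool:
    """Return True if the tapped key belongs to a user in the resource's group."""
    user = USERS.get(device_key)
    required = RESOURCES.get(resource)
    return bool(user and required and required in user["groups"])

assert tap("nfc-key-0xA1", "bldg-7-door")      # right group: the door opens
assert not tap("nfc-key-0xFF", "bldg-7-door")  # unknown key: denied
```

The HR rep in the scenario is simply editing the `groups` set; the door, the laptop, and the browser never need to know anything beyond the group name they require.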

The Implications

This vision builds on a previous set of articles I wrote (The Externet and NFC is about the eWallet, Right?) that show how pervasive the NFC revolution will be.  It has the potential to shake up many traditional industries and create opportunities that are not yet apparent.  If you’re Microsoft, IBM, Google, Apple, or Oracle, then the vision I described above has deep implications and potential opportunities for your software and operating systems.  Those who capitalize on this first could potentially overtake the corporate enterprise market.  Knock knock…Ballmer, are you listening?

In some respects Apple understands this vision.  If you attended the last WWDC then you know what I mean.  Steve made it clear that Twitter would play a bigger role in Apple’s software.  But why?  Was he just acknowledging the popularity of Twitter?  Is he grooming Jack Dorsey to be his successor at Apple?  Maybe.  But Twitter could also be seen as a repository of users: a social LDAP server.

Facebook, too, sees this, but their database is, well, unprofessional for the most part.  It identifies people to be sure, but you can almost think of LinkedIn and Facebook as the two sides of our lives.  Our professional life is represented by LinkedIn and our personal life by Facebook.  I wouldn’t want to go to a party and show my LinkedIn profile: that’s boring.  At the same time, I wouldn’t want to present my Facebook profile to a company I wanted to work at.  So Facebook’s opportunity with authentication and identification is in the personal market.

Summary

Obviously, for LinkedIn to be a global corporate LDAP server, other software in the enterprise needs to integrate with it.  But that’s exactly where a company like Google would come in.  By making Chrome the OS, utilizing Google Docs for your key enterprise work applications, Gmail for your messaging, and LinkedIn as your LDAP instance, you now have an enterprise ecosystem that other software providers can build on.

If such a vision becomes real, then a likely loser in this race is Microsoft.  Their long-unchallenged position in the corporate market could come to a screeching halt.  This would come as no surprise to many of us in the industry.  Microsoft has been languishing for almost a decade and seems utterly without vision.  Their recent overpayment for Skype and their replication of Google’s search engine as Bing were clear signs that the imagineers are no longer running the company.

Despite most people’s opinion that Facebook is the strongest of the social media providers, I think this article shows there is a latent but very large potential for LinkedIn that its current market capitalization may not adequately represent.

About the Author

Christopher R. Goldsbury is a software development professional who has played the roles of developer, architect, scrum master, development manager, project manager and quality assurance manager  throughout his career.  Chris writes on his experiences and ideas at his blog: http://www.anagilestory.com.


Intuit Helps Bridge Gap Between Credit Cards and NFC

The bridge between near field communication mobile payment technology and the credit card world for Android:

http://goingnfc.com/intuit-gopayment-nfc-concept-at-google-io-in-san-francisco/

Decline of Written Requirements

Introduction
Requirements managers, project managers, and business analysts will find the most interest in this article.  Developers will find a nice challenge at the end.  🙂  In this article, I attempt to show that written requirements are no longer adequate, and that a new tool, using new media, is necessary.
Why did written requirements fail us?

Requirements gathering, analysis, and management has never been easy.  It’s hard work filled with nuance, half-truths, misinterpretations, ulterior motives, impressions, imagination, emotion, and misunderstanding.  It’s vision crafting.
Getting everyone on the same page, with a clear and sufficiently detailed model of the proposed system, requires a conductor with the right skills.  This skill is too critical to be ignored: a lack of depth in this area can doom a project.
To help codify that vision, the software industry quickly turned to documentation and documentation standards.  But written requirements have proven to be an inadequate fit with the abstract nature of software, and these flaws became apparent:
  • “That’s not how I read it”
  • Versioning Changes
  • Big complex software = big complex requirements = never read/hardly understood
A myriad of software professionals realized that relying exclusively on the documents, and omitting engaged conversations to clarify the written word, was a mistake.  But spoken requirements and conversations have these flaws:
  • “That’s not what I heard or understood.  That’s not what I remember.”
  • Hallway conversations, informal clarifications that aren’t heard by all parties.
  • Complex requirements can’t easily be remembered, conveyed through conversation alone.
  • Change management is non-existent.
User stories attempt to blend the spoken and written worlds and emphasize continuous interaction between the customer and the software development team.  They rightfully restored the need for conversations and engaged business domain involvement, and got us away from never-ending refinement of versioned requirements books.  They get us closer to where we need to be.  But they still miss the mark.  Some flaws of user stories and their crafting are becoming apparent:
  • Software and business teams change, leading to a loss of highly valued historical domain knowledge that may or may not be documented in the code or the story.
  • An awful lot of time is wasted writing stories down, editing them, and then clarifying their exact meaning.
  • It’s a fine line.  How much to write?  How much to discuss?  Too little and the story is worthless.  Too much and we’re back to writing all our requirements.
  • Complex business domains and logic need to be documented in detail.  A simple story (or many simple stories) with acceptance criteria may not be enough.  Think about requirements for the Boeing 787 flight control system.
  • User stories assume the business owner will remain available for clarification during much of the project: not a safe bet.
Where are we headed?

With the proliferation of digital video, cell phone cameras, webinars, social media, and whiteboards, we have a new set of building blocks to create the 21st century requirements management tool.  And that is my challenge to the community.  Build it.  Use these pieces to create a new tool for managing requirements that further reduces the risk of interpretation.  Here are some advantages I see:
  • The historical integrity of the requirement conversation (and its vision) would be preserved for any future team to review and understand, nuance and all.
  • Verbal as well as non-verbal communication would be preserved and could be analyzed later for better realization of the requirement.
  • Whiteboarded artifacts could easily be recorded and saved along with the video, adding to a thorough documentation of the conversation.
  • While this wouldn’t completely eliminate the need for clarification of requirements, it could sharply reduce it.
  • Productivity in requirements gathering and recording could jump, helping the entire software development process.
If you feel you’re up to the challenge :) then contact me ( chris@effectivelogic.biz ) and I’ll give you further guidance and requirements.


The Bad Attitudes of Agile

Who is this article for?

All software development professionals will find interest in this article, but managers, CIOs and software architects will find the greatest interest.  The topic may be controversial to many, but I offer this article as insight into what seems to be a growing problem in the Agile movement.

“Why are you here?  Agile doesn’t need managers.”

Ever hear this one before?  Imagine how shocking it is to hear that the developers think your position shouldn’t exist, as if you, as the manager, had some hand in creating that position.  It’s most commonly directed at project managers as they first meet the development team they’ll be working with.  To be sure, the original Agile Manifesto makes absolutely no mention of project management, and subsequent agile theorists go further, suggesting the project manager role be adjusted to more of a coach or support role.

However, this view ignores reality.

Small, non-integration-dependent development projects probably require very little supervision of any kind, as long as you have a competent, experienced, and capable team.  However, the larger the project, the more integration-dependent the project, and the less development-centered the project, the more a project manager is needed to coordinate, communicate, and lead the overall effort.  A project in which the development portion is only 10% of the overall budget can have a scrum master lead the development while reporting to the project manager.

Furthermore, the development team is rarely aware of, or good at, managing a budget.  The amount of time required to develop software leaves little time for anything else.  This creates a bit of a blind spot for some developers, as they begin to believe that everything they are doing *IS* the project and that everyone else is just a peripheral annoyance.

The bad attitude here is the inability to recognize other roles and professions as having value, and a strict adherence to a philosophical interpretation without recognizing the need for flexibility given the winds of reality.  Taken too far, the attitude can come across as almost unionist or neo-communist in its presentation, extending the view to all management in all situations.  Surely the individual who adopts such an all-encompassing, wholesale reduction of the corporate culture and organizational structure to one flat level is a radical.  His views are on the periphery, but if he’s the right person (a leader) his views can gain traction and worsen the relationship between the development team and management, turning the goal of project completion into class warfare between management and workers.

“The team runs the project, not the managers…….we’ll decide what gets done.”

This view is often an outgrowth of the notion that management roles are no longer needed.  It flies in the face of the truth: many decisions, including software design and architecture, require collaboration among many elements of the company, not just the development team.

In other instances, developers positing this notion are simply unaware that there are other aspects to a project.  Or, even worse, a developer has been burned terribly by a bad experience and feels the need to “take control” of the project before some perceived breakdown occurs.

Regardless, agile becomes the pretext, a foundation, for an attitude which suggests that most, if not all, of the management structure above the development team has no contribution and should be summarily removed from the effort.  Letting this attitude take hold, in my experience, usually results in endless re-architecture sessions, severe budget overruns, no real end date, and a fractured, emotional team that becomes disillusioned with its own mission.

“There are no due dates or schedules in Agile.”

Those of us with deep insight into capital budgets and corporate finance know how silly this is.  However, if you read Ken Schwaber’s Scrum book, it does talk of abandoning the Gantt chart for a burn down chart.  In truth, the burn down chart is a neat and well thought out innovation, but there are those who take this to mean that there is no schedule for delivery, i.e. that the money never runs out.
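For readers unfamiliar with it, a burn down chart is just remaining story points plotted against time, and a delivery estimate falls out of it the moment you project the trend forward.  A small sketch of that arithmetic, with the sprint numbers invented for illustration:

```python
# Sketch: a burn down line (remaining story points per day) and the
# naive linear projection that turns it into a schedule.
# All figures are invented for illustration.

total_points = 40
completed_per_day = [5, 3, 0, 6, 4]   # points finished on each day so far

remaining = [total_points]
for done in completed_per_day:
    remaining.append(remaining[-1] - done)

avg_velocity = sum(completed_per_day) / len(completed_per_day)
days_left = remaining[-1] / avg_velocity   # naive linear projection

print(remaining)             # the burn down line: [40, 35, 32, 32, 26, 22]
print(round(days_left, 1))   # projected days remaining at average velocity
```

The point: the chart encodes a schedule whether or not anyone calls it one.  The money running out is simply the day the projection must land before.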

This was a painful experience for me.  I watched as a team led by a strong, charismatic technical leader, whom we all reported to, abandoned any time-based goals in favor of just producing a “working product” for the customer.  Without any time boundaries the team careened every which way.  Work ethics declined or were non-existent.  Those who wanted the product to succeed lost any motivation and drive.  The customers became bewildered as to why so much emphasis was being placed on various technical architectures while features and product change requests became lost.  The burn down charts further confused them.  All they really wanted to know was: when will the product be complete?  The team would only respond with, “We’re not on a schedule.  We keep developing until we’re done.”

When anyone attempted to set realistic goals, they were immediately knocked down as “anti-agile”.  When the team was informed that their project was hopelessly over budget, their eyes looked clouded and confused.  The connection between what they were doing, time, and, ultimately, money had become lost in the abstract design patterns etched on their whiteboards.

Realistically, there is always a due date and a schedule for delivery, explicit or implicit.  No one puts up money for a development project with the view that it will never complete.  Even more realistically, I’ve found that Gantt charts are still very useful for coordinating deep integration, or non-development aspects, between non-agile teams and agile teams.

The no-schedule attitude arises mostly because agile techniques put forth the notion that the project should continue to add new features until the money runs out.  This is idealistic, and it ignores what happens when a development team hasn’t even completed the bare minimum of functionality within the budget, rendering the application useless.  The bad part of this attitude is taking a new technique for tracking team progress and accountability and twisting it into a reason for not being accountable for delivery.

“Agile code is self documenting.  There’s no need for requirements, architecture diagrams or technical specifications.”

If you are a software architect or technical manager, this attitude is usually aimed right between your eyes.  The thinly veiled attack is meant to question your role, your experience, and the need to have anyone coordinating the overall technical design of that 28-million-line software program that generates 78% of the company’s revenues.

Certainly it is often put forth out of ignorance.  Maybe the 2,000-line web app the developer built recently required very few artifacts beyond the source code, but scale matters.  You know that, your management knows that, but this bad agile attitude chalks up your role to not staying current on development techniques like Scrum.  Major software systems require a few minds overseeing the direction and coordination of the technical vision and the many hundreds of hands creating it.

In my own experience this attitude came from a developer who, ironically, wanted to join the architecture staff.  He felt that by critiquing and arguing with the technical leaders and introducing his knowledge of agile techniques, they would respect him more and give him the coveted position he craved.  Instead, they found him to be annoying and a troublemaker.  Furthermore, his lack of tact in introducing agile concepts left the senior technical leaders with a bad taste in their mouths for anything agile.

“Agile rapidly embraces change; all change.”

My experience with this attitude came from a manager rather than a developer.  It turned out he read “rapidly embrace change” to mean all kinds of change, not just business requirements, as was intended by the original agile creators.  So fundamental architectural changes became commonplace, and shifting between different open source technologies was seen as ‘good’ even though this meant taking the team completely away from their skill sets and setting project delivery back by months.  Organizational experimentation and rapidly dropping people in and out of roles also became part of ‘rapidly embracing change’.  The end result was a mess.

Clearly, accepting change presented by customers is important, but without a system for managing that change, you’re asking for trouble.  One needs to keep track of all requirements and changes, and their impact on project delivery, so that this can be communicated to customers.  This is necessary to make effective project decisions.  If you don’t, then the customers get the unrealistic notion that anything they ask for will be included.  We know where this leads.

So the bad attitude here is accepting change without managing it.  An unmitigated free for all will only lead to dashed hopes and unmet expectations.  Change is good, but violent change is chaos.

“Agile uses generalists; we test our own software.  There’s no need for a QA group.”

Again, this view is accurate as a philosophical interpretation, but my experience, especially on large software development projects, is that you need a second set of eyes looking at what the developers created and how well it works.  Pride of workmanship is great and should be fostered, but sometimes pride can turn into blind acceptance and defensiveness.  It takes a strong and deeply honest person to recognize their limitations and find ways to mitigate them.

Using generalists puts the emphasis on making sure you’re staffed with a nimble group of multi-skilled individuals.  On reflection, this recognizes software development as mostly craft and less production assembly.  However, as software development leaders we can’t assume perfection in human resources and ignore the facts.  It’s better to see the risks and plan for them, and history has proven that developers don’t find all their own mistakes.

In my own experience, the individuals holding this view disliked anyone testing their code and were prickly toward any constructive criticism.  In one case in particular, we found the underlying reason was that the developer really wasn’t that good at coding.  He was given training and mentoring, and after many months of struggle it became clear that he was on the wrong career path.

So using generalists is fine, but the attitude becomes stale if the hard truths of decades past are ignored in favor of philosophical purity.

Summary

In conclusion, these problems could be found in the pre-agile world as well.  But in my experience these bad attitudes are finding refuge and justification in a new technique that probably never intended to present such a soapbox.  As software development leaders, it’s critical that we address these viewpoints before they take hold of the agile methodology and potentially darken a good movement.  Agile has a great message: simplify, engage the customer during product development, take ownership, and stay connected.  It would be a great disappointment to see this message lost.  So what do you think?  Are these attitudes in your shop?  How do you address them?  I’d like to hear from you.


Agile Finance – Story Point Cost

Who is this article for?

This article is written for those with management and budgetary responsibilities for a software development project or team. Others, including developers, quality assurance personnel, and CEOs/CIOs may find interest.

Why would we need to estimate story point cost?

Story points are used to estimate work.  Investment in that work is expected to derive some benefit.  If that benefit is expected to be financial, then understanding the cost of the work is essential to deriving any meaningful ROI.  Even if no ROI is expected and the intended benefit is regulatory compliance (as an example), company leadership usually wants to understand how much of their limited financial resources is going toward any specific feature, iteration, or release.

How do we do it?

The technique presented here is a historical parametric approach. It relies on past data from previous projects. So, one has to have some of this data saved up before a reliable figure can be derived.

RC = Total dollar cost of a historical release of a product

RSP = Total story points that contributed to that release

RSPC = Release Story Point Cost

RSPC = RC / RSP

Once you have this for one release you should calculate it for all historical releases. The next calculation is an average:

Average RSPC per product = (RSPC₁ + RSPC₂ + … + RSPCₙ) / N

If you want the story point cost across all products, then average it again.  Although, for most planning purposes it’s useful to plan by product line; this higher level of abstraction might water the cost figure down too much.
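The calculation above is simple enough to sketch directly.  The release costs and story point totals below are invented for illustration:

```python
# Sketch of Release Story Point Cost (RSPC) from historical releases,
# plus the per-product average.  All figures are invented.

releases = [
    {"cost": 120_000, "points": 300},   # release 1.0
    {"cost": 90_000,  "points": 200},   # release 1.1
    {"cost": 150_000, "points": 400},   # release 2.0
]

# RSPC = RC / RSP, computed for every historical release
rspc_per_release = [r["cost"] / r["points"] for r in releases]

# Average RSPC per product = (RSPC1 + RSPC2 + ... + RSPCn) / N
avg_rspc = sum(rspc_per_release) / len(rspc_per_release)

# Example use: cost of a new feature estimated at 25 story points
feature_estimate = 25 * avg_rspc

print([round(x) for x in rspc_per_release])   # [400, 450, 375]
print(round(avg_rspc, 2))                     # 408.33
```

With the average in hand, the questions in the next section reduce to multiplying a story point estimate by the average RSPC.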

What questions does this help answer?

  1. How much will it cost to add this feature?
  2. How much will it cost to deliver release 2.1.0?
  3. What is the cost of an average iteration?

How often should it be updated?

The astute among you will notice that we’re using historical data.  Historical data is only accurate as long as nothing changes.  To counteract shifts over time in team size, capability, and mix, one needs to redo these calculations at regular intervals.  How often?  That’s a judgment call.  I do it monthly, as I’m in a rapidly growing team with many new products popping up.  I constantly need to reassess my cost driver.

A more stable team and product might require only six-month intervals.  The relevant point here is: keep it accurate.

Summary

Story point cost ties a rather abstract, developer-centered concept to the real world of business.  This is necessary.  If we intend to use story points in a meaningful fashion in our development environments, then they must have some counterpart in the spreadsheets and ledgers that the world’s businesses run on.