Does Your Community Meet the Needs of Its Members?

Everyone needs food to eat, water to drink, air to breathe, clothes to wear, a house to live in, and energy to heat and cool it just to survive. To ensure sustainability, add safety, health, and education for good measure. The communities where people live face the same imperative: either they provide the means by which members can meet their needs, or they outsource that responsibility to others some distance away. In the former case, communities are on a sustainable path; in the latter, community survival is put at risk when needs cannot be met.

Needs met is the universal measure of success and sustainability of any community. Does your community meet the needs of its members? To find out, take a quick survey of your community by considering the following three questions:

  • Are any community members hungry, homeless, or living in unhealthy or unsafe conditions? Are any confronted with a lack of paid work, illness, limited healthcare, or inadequate life skills to influence their circumstances?
  • How do you and your community keep score on needs met?
  • What are you and other community members doing to impact the scorecard?

The answers to these questions speak loudly. If your community is like most, the answer is “Yes” to the first one. It’s almost impossible to eliminate these conditions. However, the next two answers tell the tale as to whether your community is on track to sustainability.

Their order is important. Tom Peters is quoted as saying, “What gets measured, gets done.” As a community addresses the issues identified in the first question, the answers to the second question are calories of food, gallons of water, kilowatts of energy, cubic feet of fuel, units of housing, tonnage of recovered waste, etc. Those metrics shape answers to the third question in the form of infrastructure projects and business cases that deliver outputs according to the scorecard.
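The scorecard described above can be sketched as a simple data structure. The following is a minimal illustration only; the metric names, targets, and actuals are invented for the example and are not figures from any real community:

```python
# A minimal sketch of a community "needs met" scorecard.
# Metric names and numbers are illustrative assumptions, not real data.
targets = {
    "calories_of_food": 2_500_000,   # daily calories needed community-wide
    "gallons_of_water": 150_000,     # daily gallons
    "kilowatts_of_energy": 40_000,   # daily kWh
    "units_of_housing": 1_200,       # adequate housing units
}

actuals = {
    "calories_of_food": 2_100_000,
    "gallons_of_water": 150_000,
    "kilowatts_of_energy": 33_000,
    "units_of_housing": 1_050,
}

def needs_met(targets, actuals):
    """Return the fraction of each target met, capped at 100%."""
    return {k: min(actuals.get(k, 0) / v, 1.0) for k, v in targets.items()}

scorecard = needs_met(targets, actuals)
for metric, fraction in scorecard.items():
    print(f"{metric}: {fraction:.0%}")
```

However a community keeps score, the point is the same: the metrics in the second question make the gaps visible, and the gaps drive the projects in the third.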

In effect, how a community responds to these three questions highlights economic choices made by community members. Those responses can be mapped onto a flow diagram, such as the one below, to give an overview of the local economic system for the community.

In the local economic system above, the drive for local economic sustainability precipitates flows of community assets and work functions. These flows increase in their range of application and depth of value-add as they proceed toward a community investment portfolio loaded with infrastructure projects and business cases. When launched, those projects and businesses generate calories, gallons, kilowatts, units, etc., in an effort to meet the needs of community members.

By placing their emphasis on how to meet their needs, community members take ownership for their sustainability. This commitment frames opportunities to charter projects and develop businesses. It also establishes a local economic context for business models that are based on widespread participation.

So, what’s the scorecard for your community? Calories in meals or community gardens? Gallons of water or number of water meters? Kilowatts of power or utility bill assistance? Housing units or vouchers? And so it goes… How does your local economy rate in terms of progress toward community sustainability? Are unemployed and underemployed community members engaged in paid work through local businesses and organizations whose outputs help meet local needs?

These are topics for further consideration in future posts. Stay tuned…

Originally posted to Sustainable Local Economic Development on Tumblr by Steve Bosserman on Wednesday, September 1, 2010

What Is an “Integrated Solution”?

A colleague of mine called the other day and wanted to know how I would answer the question, “What is an ‘integrated solution’?” It seemed that he was entertaining this concept with others in senior management and was struggling to find an answer that didn’t sound like gobbledygook or pure philosophy that offered nothing pragmatic. In typical management fashion, he needed an answer right away. And it would be particularly helpful if it could be condensed into a 10-second sound bite that anyone could comprehend. Of course, pressing all that knowledge and insight into a 10-second statement is quite a challenge – one that launched us into a 30-minute animated conversation. Here’s the 10-second version:

An integrated solution:

  • Targets what a specific customer’s organization – business, not-for-profit, government agency, etc. – is providing (portfolio), the manner in which it does that (model), and the context in which it operates
  • Appeals to the combination of values considered most important by an individual customer
  • Provides a package of products, services, and technologies that function more effectively as a whole than the sum of the individual elements that comprise it

So, what is so difficult about that?

While this definition of an integrated solution makes intuitive sense, it challenges management’s conventional wisdom. Here are some reasons:

First, organizations do not really know their individual customers: what each is doing, why they are doing it, or the realities that drive them to do what they do the way they do.

The result: generalizations are made about customer buying patterns; marketing and advertising campaigns based on those generalizations are developed and launched; and products and services are developed and delivered in response to aggregated assumptions about customer buying behaviors. However, organizations don’t really get behind the behaviors to understand the specific dynamics of what the individual customer is doing, why, and how. Instead, the organization strives to stay competitive solely on features and price. To improve their bottom lines, organizations focus on making what they offer more attractive, cranking up sales / delivery volumes, and reducing operating / product costs.

To lump customers together into various segmentation schemes creates a psychological distance between the provider and the customer and keeps the customer in the position of having to define and acquire the integrated solution.

Second, organizations do not know how to measure the value of what they provide with metrics other than money.

The result: assumptions are made about price, return, cost, and profit that are predicated on a narrowly defined value equation. In other words, the value proposition does not include key factors that drive customers to make buying decisions. Certainly no customer intentionally chooses a portfolio of deliverables and operational model that loses money. At the same time, though, customers are increasingly interested in the impact their operations have on factors such as the environment, safety and security, and quality of life, as indicated by metrics like the following:

  • What percentage of the energy consumed to produce and deliver is green?
  • How far away from the point of use was something produced?
  • Can what is produced be tracked and traced from inception to delivery?
  • How much carbon is emitted versus sequestered in producing it?
  • How much total time does a customer spend to define an integrated solution, bring it together, implement it, and follow up to see that it functions as intended?
  • What is the balance between a customer’s investments in the operation and life’s other interests?

To focus solely on money misses the point of differentiation that distinguishes an integrated solution: its capacity to answer to multiple drivers within a unique customer’s evaluative framework.

Third, organizations measure their success on providing profitable standalone products and services rather than combinations of products and services that can be easily integrated.

The result: the sale of the products and services becomes the prime objective and additional features and functions are thrown in as incentives to sweeten the pot, beat what the competition is offering, and win the deal. In other words, option packages default to bundling techniques rather than giving the customer the best deal for the price based on improvements in the customer’s business operation. Furthermore, there is often an even greater difficulty in putting combinations together that cross from one brand to another. To connect solution elements from two or more brands, the customer must purchase ancillary parts, components, and modules both in hardware and software. In many instances the customer must purchase an entire system from one brand in order for it to function within a more comprehensive solution even though several elements that comprise the system are still functional. This makes it challenging for the customer to leverage investments in assets.

To assume that customers will always be drawn to purchase a product or service based on its reputation, capability, and price alone is a questionable strategy: technology advances and integration increases; integrated solutions become easier and more commonplace; they become the new baseline from which one enters or stays in a market.

But how long will this take? What if an organization is already doing quite well with standalone products and services? How can the incremental add of selling a solution ever equal the advantage that comes from simply selling more products or services?

Hold an iPhone. Think back ten years. How many pieces of electronic gadgetry would you have to carry to equal what the iPhone can do – if such functionality were even possible? Think past all the things it can’t do, or can’t do as well as you’d like, and fast forward five years into the future. What will be the degree of integration you can anticipate then? It’s hard to imagine, but one thing you can count on – there will be more integration, and in different ways than you thought!

Increasingly, customers have more choices. That is a good thing – to a point. Unfortunately, the customer is left having to sort through countless combinations and possibilities to come up with the best-suited solution alternatives. As technology continues to get smaller, faster, stronger, more embedded, more intelligent, and more integrated, each customer will expect choices to be measured according to their full value as effective integrated solutions. The successful organization of the future is one that establishes its reputation as a trusted provider of integrated solutions. In effect, it will earn the right to be the integrator for the customer. Does this mean that organizations will have to change the way they relate to customers? By all means! This distinction saves them from “commodity hell” and positions them for a sustainable future. Integrated solutions: it’s the future that is fast upon us! And that’s the 1-second version!

Originally posted to New Media Explorer by Steve Bosserman on Sunday, July 15, 2007

Thoughts about Value-Add

Value-add dominates our economic scorecard. It is relatively easy to calculate in a manufacturing setting where value is added through material transformation at each step as a product moves from raw material to finished goods. Customers monetize this value by purchasing products they anticipate will add value to their processes. Value-add also pertains to certain services, like financial and legal, that require a certified, licensed, or bonded provider that possesses or delivers specialized skills or knowledge. The consequence of not utilizing these services is that the customer assumes risk.

The concept of value-add also plays a role in information technology and data services. Here, though, the meaning is vague. What value can be assigned to having data or to having data in a usable format? This instance of value is intangible and determined by the receiver of the message. Nowhere is the intangible nature of value-add more evident than in marketing strategies and advertising campaigns. Information that induces a user to pay for a product or service has value only to the producer. Value-add for the customer or client occurs at the next step — the point of utilization.

So much for the traditional view of value-add. Here is where the current issues of globalization – localization come into play. A robust business strategy can entertain and exercise both sides!

On the globalization side, value-added products are produced and services provided far from their points of utilization and consumption. Success is driven by appropriate economies of scale. This situation will continue for many years to come as customers and clients exploit lower cost alternatives. On the localization side, value-added products and services are produced in close proximity to their points of utilization. Success here is driven by economies of scope. This situation enables products and services to be integrated into specific applications or solutions that are tailored for highly localized contexts.

How, though, does one put a value-add strategy in place? Re-enter data and information. Essentially, a successful strategy is a contextually relevant plan of action, conceived through the knowledgeable (and hopefully, wise!) interpretation of data and information, and executed by a skillful tactician. No matter how similar or unalike the challenges, relevant data and information are the common denominator.

If everyone has access to the same data and information, the access itself carries no value-add. Yet universal access delivers a significant benefit: the rates of discovery of new knowledge, application of knowledge already learned, and transfer of experience with applied knowledge from one place to another all increase. In other words, accessibility to data and information makes the global human system of knowledge generation and utilization more effective, efficient, and expansive.

Since data and information carry no particular value unless one does not have access to them, they form a unique type of “commons”. Anyone may contribute to the pool of data and information, all benefit, and the quality of what is available is not diminished or compromised by the number of users. In fact, the quality and variety increase with more participants, as evidenced by Wikipedia.

This “commons” approach is a cornerstone in “open source” philosophy wherein volunteers contribute entries and edits and the content is free to use. Originating in the realm of software development and usage, open source applies to any instance where people collaborate in the development, sustainability, and scalability of a system whereby end-users freely pull what they need from the system and respond to their unique circumstances. Participants increase the working knowledge about the system as they act locally and provide feedback to designers / developers so they improve the system’s robustness, range, and ease of use.

A comprehensive business strategy judiciously positions an “open source” / “free knowledge” dimension on the globalization and localization continuum. What to share in open forums, what to hold as proprietary and reserve for limited audiences, how much to contribute in the development and sustenance of open source endeavors, how much to invest in products, services, and technologies for satisfactory returns, where to standardize products, services, and technologies for economies of scale, where to proliferate economies of scope solutions within localized contexts – these are the kinds of questions about openness, standardization, and uniqueness that drive effective business strategies for all organization types.

Technology gets smaller, faster, stronger, more embedded, more integrated, and more intelligent. Localization increases. The value-add equation is redefined, and greater significance is placed on unique solutions. Addressing the above questions helps organizations adapt within their ever-changing operational landscapes. The implication is that organizations must network and collaborate more broadly to energize, inspire, and focus their subject matter experts where it counts most – learning what customers and clients need in their business and social contexts and responding with value-added alternatives. Isn’t that “business as usual”?

Originally posted to New Media Explorer by Steve Bosserman on Thursday, July 12, 2007

A Broader Framework in Which Localization Occurs

One of the drivers behind technology development is the quest for human equivalence – the point where technology performs at a level of functioning that is equal to or greater than the functioning of the human brain. While it is speculative at best to estimate if and when such a goal is achieved, recent history illustrates that the increase in capability and capacity of technology is ramping up a rather steep slope. And if we are to trust the application of Moore’s law, technology’s prowess is doubling every 18-24 months. At that rate, it doesn’t take much to project a future wherein technology is closing in on human equivalence.
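The projection in the paragraph above is simple compound growth: if capability doubles every 18 to 24 months, it multiplies by 2^(t / doubling period) after t years. A quick sketch (the ten-year horizon is chosen only for illustration):

```python
# Rough Moore's-law projection: capability doubling every 18-24 months
# compounds dramatically. The 10-year horizon is an illustrative choice.
def projected_capability(years, doubling_period_years):
    """Relative capability after `years`, starting from 1.0 today."""
    return 2 ** (years / doubling_period_years)

for doubling in (1.5, 2.0):  # 18 months and 24 months, in years
    print(f"doubling every {doubling} yr -> "
          f"{projected_capability(10, doubling):,.0f}x in 10 years")
```

Even at the slower 24-month pace, a decade yields a 32-fold increase; at 18 months, roughly 100-fold. That is the steep slope referred to above.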

As a trend develops, it is useful to be able to track its progress and anticipate its trajectory. Choosing or crafting a set of markers that indicate a trend’s speed, depth, and scope as it gains influence and becomes an impetus for change is critical. While there are many markers from which to choose, the most durable and universally applicable set concerns value added – particularly, where and how value is added.

The simple Wikipedia example about making miso soup from the above link is a good one to illustrate how advances in technology change the value-added equation. First, the value of the soup as the end product comprises the value added by the farmer to grow the raw product, soy beans, plus the value added by the processor to the soy beans to produce tofu, plus the value added by the chef to the tofu to prepare the soup. This “value package” utilizes a combination of equipment, input, labor, and know-how applied in various locations, stages, and timeframes – and is based on a specific capability and capacity level of technology.
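The value-package arithmetic is just a sum over stages. In this sketch the dollar figures per stage are invented for illustration; only the structure (farmer + processor + chef) comes from the example:

```python
# The "value package": the end product's value is the sum of the value
# added at each stage. Dollar amounts are illustrative assumptions only.
stages = [
    ("farmer grows soy beans",          0.30),
    ("processor turns beans into tofu", 0.70),
    ("chef prepares miso soup",         2.00),
]

total_value = sum(value for _, value in stages)
for stage, value in stages:
    print(f"{stage}: +${value:.2f}")
print(f"value of the soup: ${total_value:.2f}")
```

Any technology that shrinks a stage’s cost, or collapses two stages into one, changes a term in this sum – which is exactly what the next paragraph explores.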

What happens when technology develops further? There are several possibilities: the soy beans are grown in close proximity to the preparer; the yield of soy bean plants and desired quality and characteristics of the beans are increased; the equipment that harvests soy beans conducts post-harvest operations that condition the beans for making tofu; this equipment is smaller and more compact which accommodates localized production; methods of packaging, storing, and shipping soy beans or tofu are more integrated thereby consuming less energy and taking less time. In these instances, advances in technology are applied to the value-added equation dramatically altering the value package. The result is a system utilizing less costly and more productive equipment, requiring fewer inputs and less labor, and deeply embedding human knowledge and experience into new processes and tools. This has the potential to be transformational—and in relatively short order, too!

While the example of soy may represent a somewhat narrow space within which profound change can be noted, it does highlight where and how value-added steps are enabled by technology. These changes can be witnessed in a broader sense through the lens of large social and economic “eras.” The first of these, industrialization, brought developments in technology to bear on centralizing facilities, equipment, and people in the production process where capital investments could be amortized through economies of scale.

As production technologies become scalable, logistics are more integrated and efficient, and information and communication technologies are more pervasive, powerful, and responsive, manufacturing operations are dispersed close to those areas where lower cost skilled or tractable labor is available. This is the impetus for “globalization.” Attendant to the distribution of manufacturing capability is the transfer of technology and subject matter expertise. This significantly increases the technological competence of the lower cost workforce. In this regard, globalization heightens the ability of people to utilize new technologies when presented and results in a more evenly distributed capability worldwide.

This puts us on the brink of the next era: localization. The embedded link goes to one of my earlier postings about this phenomenon, so I will not wax on about it again here. However, one quick observation: localization is the inevitable outcome of technology continuing to cost less, get stronger, fit into smaller spaces, run faster, be embedded in more operations, streamline processes, and sense, respond, adapt, learn, and sustain itself despite problems and challenges. To put such an imperative into perspective, the more we transfer technology from one place to another under the auspices of globalization, the more potential we are placing in the hands of the recipients to utilize those technologies in developing localized applications. Constant application of technology that packs more punch at lower cost is what SUSTAINS the drive toward localization. Without technology, localization would merely be an updated term for the back-to-the-land movement of some 40 years ago. While localization may imply a different lifestyle choice, it is actually honoring well-deserved quality of life factors while continuing to take advantage of what an improved standard of living provides.

What happens beyond localization as technology continues its trek to become smaller, faster, stronger, etc.? Imagine assembling the end product from molecules – at the point of utilization – precisely at the time it is needed. Yes: get small enough and one is into the basic building blocks of material, molecules. This is the realm of nanotechnology, specifically, molecular manufacturing.

While such a concept has the earmarks of science fiction or the paranormal – and, indeed, there are many who contend it is one or both – technology will continue to shrink the distance between production and utilization until they are as close to the same point as possible, and the material manifestation will have the immediacy and convenience of what is conceived virtually. The development timeline for molecular manufacturing suggests a useful output rests some distance in the future and that it will come at considerable expense.

This time is needed. Eric Drexler, one of the leading thinkers in the field of nanotechnology, co-founder of the Foresight Institute, and currently, Chief Technical Advisor for Nanorex, Inc., is a clear advocate for “responsible nanotechnology.” Citing the hypothetical possibility of the world turning into “gray goo” should molecular nanotechnology run amok, Drexler advises the imposition of a stringent ethical framework on these technologies before they are endowed with the capability of self-replication. Not bad counsel, regardless of whether one buys into Drexler’s future vision for nanotechnology.

And maybe that’s the reason we need to spend time in localization before leaping ahead to what’s next. It is in the strength of the community experience that we learn to act upon our value as a society rather than default to the strength and survival of the fittest individuals. This is the intent behind the Nanoethics Group. As an extract from their mission states, “By proactively opening a dialogue about the possible misuses and unintended consequences of nanotechnology, the industry can avoid the mistakes that others have made repeatedly in business, most recently in the biotech sector – ignoring the issues, reacting too late and losing the critical battle of public opinion.”

Yes. One can only imagine what happens if the machine – nanotechnology, in this instance – has the unfettered capacity to choose who survives with no more ethical framework in place to guide it than the ones we humans use today…maybe we are not quite ready for human equivalence!

Originally posted to New Media Explorer by Steve Bosserman on Tuesday, July 10, 2007