WC.com

Thursday, March 6, 2025

Hybrid Virtual

Much of the news about virtual work has been critical of the practice. The topic has been discussed here on various occasions, particularly during the Great Panic. The trend shifted significantly in 2024; see Shifting Virtual (August 2025) and the posts referenced there. The persistent theme has been workers perceiving great personal benefit and increased productivity from virtual work, while employers remain somewhat less impressed.

The end of 2024 brought a massive return to the office (RTO) in America. See Heigh Ho (January 2025). What was hailed as the "Great Comeback" began in the private sector with big-name employers. RTO has spread vigorously, if haltingly at times, through the federal workforce in 2025. According to the Associated Press, most federal workers will be back in the office this year.

I have repeatedly noted the probability of continued virtual work nonetheless. Seemingly, the probability of working remotely increases markedly with expertise, education, and experience. One of the staunchest critics of virtual work is the CEO of JPMorgan, whose main criticism seems to be that "it doesn’t work for creativity. It slows down decision-making.” He is most critical, however, that "The young generation is being damaged ... they are being left behind.”

This all aligns, and the end of virtual work may come. To paraphrase, "a day may come when the (usefulness of virtual work) fails, ... but it is not this day" (Tolkien, The Return of the King, 1955). The end will likely come sooner for those beginning in any profession; in fact, for them, that may well be "this day." But for those who wield experience, training, and education, that day may be distant indeed.

This all came to mind with a news report from Cleveland earlier this year. Fox8 Cleveland reported that a hospital there is building remote work into a new hybrid model designed to retain talent, train and support the front line, and deliver better overall patient service. The nurses described in that report work remotely and use technology to connect with the hospital staff.

Remote medical care is not new. Telehealth existed even before the Great Panic but blossomed in its wake, according to Primary Care: Clinics in Office Practice. This concept was discussed here in 2016, regarding the efforts of Dr. Tearsanee Davis at the University of Mississippi Center for Telemedicine. No, telemedicine is neither new nor novel. See At SAWCA 2016 Annual Convention (July 2016).

The University of Mississippi model, described there, has telehealth professionals contacting remote and rural patients for persistent and near-continuous follow-through and reevaluation. In rural America, there are a great many who cannot access medical care but enjoy broadband access and tablets. These patients used to see a traveling nurse periodically, and now they can get a televisit at will. 

I have spoken to both providers and patients in this program, and they have been complimentary. The patients laud the more ready and regular follow-up, as their wounds, diet, complaints, and more are a frequent focus. Some enjoy it as much because it interrupts their isolation and loneliness, which itself has distinct value. But this is a system built on a hospital-based team reaching out to the patient at home.

The Cleveland effort is not dissimilar: the remote nurses are engaged in patient monitoring and care over video connections in the patient's room. However, this concept goes further, connecting hospital caregivers digitally to expertise where and when it is needed. The virtual nurse thus relieves on-site caregivers of some observation and documentation while providing support, experience, and expertise to those delivering in-person care.

This hybrid delivery model offers a path to retain the experienced and able who have spent years in hospital care but have grown weary of the commute. They monitor, document, and interact with patients. More importantly, they are "guid(ing) newer nurses through care procedures." The young staff, about whom the JPMorgan CEO (above) is so broadly concerned, are being mentored and supported in this Cleveland example. There is a focus on an environment of professional development.

As noted, I harbor no doubt that virtual work will remain for senior management and diminish for the entry level. As technology improves and resources expand, there is little doubt that communication and interaction can likewise evolve to support both roles significantly.

There is room in the grand scheme for both in-person and virtual engagement. It will require imagination, productivity, efficiency, and management. More importantly, it will require the in-person contingent to engage and interact. They should be prepared to seek help, ask questions, and converse (yes, I mean old-fashioned phone calls, interchange, and collaboration). 

The remote senior management will not have the visual cues of someone looking concerned, confused, or distracted. They will not run into employees in the breakroom or hallway. They, too, will have to focus specifically on starting conversations. 

For most of us, virtual will remain a distant dream in the world of work. However, for a small group of the experienced and able, it will remain a reality if engaged well, managed carefully, and delivered effectively. Any worker may aspire to enjoy the benefits of such a role, perhaps not "this day," but "one day."






Tuesday, March 4, 2025

It's Just a Question of When

There is increasing news of people challenged in their job searches. The Great Panic led to significant issues when the government implemented shutdowns and virtual work became common. For a while, employees held the upper hand regarding work, remote options, and more. However, this situation has begun to shift back, as seen in Heigh Ho? (January 2025) and Not Today (February 2025).

 

The Washington Post recently reported that in the tech industry, employee leverage has long been more established. The story highlights the allure of Silicon Valley, with its generous pay, benefits, and the promise of tackling interesting problems at cutting-edge ventures. The environment offered perks like free meals, dry cleaning, and niche wellness services. While I have never enjoyed such perks, I did once work at a pizza place that allowed me unlimited free pizza and beer. Not a bad gig, in retrospect.

 

Those perks and that demand now seem to belong to a bygone era. The Post focuses on workers who are no longer being recruited from project to project. One tech expert with years of software experience recently applied to more than 140 jobs but received no offers. That must be disheartening, regardless of the circumstances. One hundred forty rejections is a considerable number of "no's."

 

As an aside, people generally don’t like being told "no." There’s a trend on social media called "rejection therapy," in which individuals make requests with the specific intent of being rejected. The idea (not a medical theory; I am not a therapist or doctor, your mileage may vary, use at your own risk) is that you can train yourself to accept "no" and lose the fear of making requests. It's at least an interesting read. Oh, and please send me $1 billion (here's your chance to say "no" and help me with my therapy).

 

The Post's main point is that finding work is becoming increasingly difficult. Breaking news: finding work has always been challenging. The difference is that the programmers currently experiencing it have been somewhat sheltered from the "real world" for years because there was exceptional demand for their skills.


The job search is expected to become even more competitive as the labor market changes. Some pundits predict that federal layoffs in 2025 could run into the hundreds of thousands. Those layoffs may prompt related cuts in sectors like retail, service, and food, especially in areas where foot traffic falls because of the federal departures. Consider the 50,000 job cuts mentioned by the CEO of JPMorgan in Barron's. As employers enforce a return to the office, will employees choose to leave?

 

Any displaced workers will find themselves in retirement mode, entrepreneur mode, or job search mode. Those searching for work will increase the applicant supply for existing hiring demand and perhaps make finding a job harder. Entrepreneurs may create additional jobs, increasing the demand for labor and offsetting some of the supply increase. This is all a matter of calculus, not simple math. There are bound to be many moving parts.

 

Anecdotal stories on social media indicate that many are engaged in long, fruitless job searches. A PR Newswire report in December found that 48% of US workers surveyed said they were currently job hunting. That was before the full impact of the "Back to Office” (BTO) push. Employees express frustration about "phantom jobs, ghosting, bias," and more. In other words, as Everett R. Lake noted:

“It’s a jungle out there, so you best beware.”

The layoffs are real. The present moment is characterized by rollbacks, position cuts, and leaner workforces. Fortune reports that recent years have witnessed mass layoffs and routine cutbacks. Some employees lament losing their jobs despite receiving good performance reviews and producing quality work. They perceived the situation as straightforward: good work equates to continued employment.


One employer has characterized those laid off as underperforming or undercontributing. The employees fear those public statements might stain them as they search for work. Essentially, they worry that news of XYZ Corp laying off less-than-stellar employees may taint anyone with XYZ on their resume. That, too, departs from the simple equation of good work and continued employment. How does XYZ gain from impugning these workers? Comments from leaders at these companies have also bruised egos and sparked resentment.


Other companies have publicly announced layoffs without discussing employee performance. One noted that the layoffs would result in a workforce that is a "better fit." That seems more prudent than citing subpar performance. Perhaps there is a reason to belittle and demean those who are departing, but I admit I cannot see it. Email me if you do.

 

Nonetheless, the equation is far more complex. If you are the best buggy-whip maker in the factory, the arrival of the automobile still heralds trouble. That may be hyperbole. Too strong? The bottom line is that businesses are focused on producing results. Those who complain that companies merely seek profits need to realize that is what businesses do. 


Working at a business is a value exchange. The worker brings value and is rewarded with value in return. If the worker brings exceptional value, but the business cannot sell it, cannot find a willing buyer, then the quality of the service does not matter. Exceptional? Outstanding? Phenomenal? Even if all are resoundingly "yes," the business may not be able to move the product (buggy whips). 

 

Throughout my life, I have witnessed various jobs transform due to technology. The reality is that tech has reduced demand for some skills. People have had to adapt. Just ask the auto workers who once built car parts in Anderson, Indiana. Ask the folks who built "the tractors and the combines that plowed and harvested this great land." (Month of Sundays, Don Henley, Warner Brothers, 1985). The world of work has changed. Tech has changed it, NAFTA has changed it, and more change is coming. 

 

Is it fair that exemplary employees may not be needed at a company? That largely depends on perspective. If the company must keep unneeded employees, the cost of its products will rise for everyone. Domestic labor costs must be competitive with those in foreign markets, or the products and services will not be competitively priced. It is unfortunate, but companies must be competitive. The market is free, and it is worldwide.


I am amazed at how many people do not realize that not all countries have workplace health and safety regulations. Not all offer family medical leave, fringe benefits, social security matching, workers' compensation, unemployment compensation, collective bargaining, and so much more. Each of these adds to the cost of employing someone in the U.S. These benefits are critical to Americans. They are valuable. They also contribute to the cost of goods and services, costs that foreign competition may not face.


Let's all be honest. No one stands in front of the retail shelf and reasons, "but this one is made in America, and I am willing to pay more to support those workers." I tried it for years, but finding a "Made in America" label today is increasingly difficult. When you find something similar, it is often a more passing reference like "assembled in America" or "distributed by _____" (an American company with an address or city stated for effect). Yes, labor is an expensive input cost, and one that includes many U.S. inputs not present in the competitive world market.

 

Nonetheless, there is a natural and understandable disappointment and disillusionment with being laid off. If you have never been fired, you have no idea how rattling, disappointing, and demoralizing it is. I have fired a few people over the years, and it is never easy. I was laid off once, and that was no easier. Even so, businesses may need to shrink their workforces. Layoffs may be as inevitable as hiring sprees.

 

Ultimately, the outcome is the erosion of job security and trust. Employees feel disenfranchised and disheartened. They observe companies making profits and growing financially while struggling to comprehend why their contributions are overlooked or undervalued. The fact is that we are all cogs in a wheel. We must evolve, grow, and produce. If we fail in that, our end will come. Billy Joel said it well decades ago (paraphrasing A Matter of Trust, 1986, Columbia Records):

 

Some love (work) is just a lie of the heart;

The cold remains of what began with a passionate start.

And they may not want it to end,

But it will; it's just a question of when.

It is not a life. It will never love you. It is a job, a profession, a vocation. It will end, and you will go on. There is life after. Plan for it now; grow for it now.

 




Sunday, March 2, 2025

Singularity

Artificial Intelligence (AI) is increasingly influencing our lives. Despite the pervasiveness of AI, I keep running into people who are unsure what it is, how it could impact them, and why they should care. 

Some are dinosaurs blithely strolling toward a retirement that is closer than they appreciate and worse than they dare fear. Others are so busy in their day-to-day that they cannot attend to this innovation or its perniciousness. And some are simply unaware that "Denial ain't just a river in Egypt" (attributed to Mark Twain).

The progress continues nonetheless. In December 2024, Google unveiled its "quantum computing chip." The British Broadcasting Corporation (BBC) says this new chip could decrease computing time for the most challenging computations. The example given is an equation that would require today's fastest supercomputers ten septillion (10,000,000,000,000,000,000,000,000) years to complete, which the new chip could finish in five minutes. This would use "particle physics to create a new type of mind-bogglingly powerful computer." Read that again: five minutes for the new chip to do what would take our best technology practically forever. Ten septillion years is longer even than George Burns lived.
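To appreciate the scale of that claim, a quick back-of-the-envelope calculation helps. The sketch below is purely illustrative; the ten-septillion-year and five-minute figures come from the BBC report cited above, and everything else is simple arithmetic.

```python
import math

# Illustrative comparison of the claimed quantum speedup.
# Figures from the BBC report: ten septillion (1e25) years on today's
# fastest supercomputers versus five minutes on the new chip.
MINUTES_PER_YEAR = 365.25 * 24 * 60           # about 525,960 minutes

classical_minutes = 1e25 * MINUTES_PER_YEAR    # ten septillion years, in minutes
quantum_minutes = 5.0                          # claimed runtime on the new chip

speedup = classical_minutes / quantum_minutes
print(f"Claimed speedup: about 10^{math.log10(speedup):.0f}")
# Prints: Claimed speedup: about 10^30
```

In other words, the claim amounts to a speedup on the order of a million trillion trillion, which is why the announcement drew so much attention.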

February 2025 brought news that Microsoft also has a new chip named Majorana. It claims this is "the world’s first quantum chip powered by a new Topological Core architecture," and it will be used to create "quantum computers capable of solving meaningful, industrial-scale problems in years, not decades." Microsoft claims this is game-changing. It says "All the world’s current computers operating together can’t do what" one of its new computers will be capable of.

Courtesy, Microsoft.

Within days, Penn Today announced that Penn engineers have created "a new silicon-photonic chip" that "uses light waves, rather than electricity." This has the potential to perform calculations at incredible speed while "reducing (computer) energy consumption." Note that one of the greatest concerns with AI is the volume of electricity required to power and cool its server farms.

Only a few days later, Amazon announced its "quantum computing chip." It is said to have the capability to decrease "quantum error" and to enable "quantum computers capable of solving problems of commercial and scientific importance that are beyond the reach of today’s conventional computers." This effort is underway at the California Institute of Technology.

All of this can be summarized with great simplicity:

Computers are going to be faster, more energy efficient, and more powerful than you ever imagined. They will do more, with less, and faster than we can really comprehend. The next generation will be that much better than the last, and that has always been true.

Against that reality comes the potential for "singularity."

This refers to "the theoretical point where machine surpasses man in intelligence," according to Popular Mechanics. There is great debate and disagreement in the scientific world as to when we will reach singularity. Some say it is a decade away; others claim it will occur in 2026.

There is agreement, however, that it is an eventuality. AI is driving scientific thought, expanded interest, and progress. Though there is disagreement on the "when" of singularity, there is a rough consensus that it will "arrive before the end of the 21st century."

There is a confluence of software and hardware driving us forward. We are on a ride that many of us simply will never understand. Our world, way of life, way of work, and more are going to change. The implications are potentially cataclysmic or rapturous, or perhaps somewhere in between.

We elderly folks remember a day when computers filled rooms yet could not outperform today's smartphones. We remember days when entire floors of buildings were crammed with shelves of paper, which today could all be stored as images on a device that fits in your pocket.

Today, we are building data centers to power and house the vast server farms that AI requires. That investment for 2025 looks to be about $2 trillion worldwide. It is likely those data centers will go the way of the Univac computer (below) as chips and other technology continue to evolve, as the pace of evolution increases, and as the world becomes increasingly dependent on them.

Courtesy U.S. Census Bureau

There are only going to be three kinds of workers in the world in ten years: Those who adapted to AI, those who wish they had, and those who never got the chance. If you are reading this, you are not in the third group, and you have a clear choice in front of you. 

A list of previous Artificial Intelligence and Robotic posts is on DWLangham.com