The Intersection of Philosophy and Physics

Sometimes in life we find that what initially appear to be separate and discrete interests can converge to form a compelling composite that begs to be explored. When this occurs in the context of trying to live an examined life, a stimulating and energizing endeavor is launched. For me lately, by which I mean the past three years or so, that is occurring at the intersection of philosophy and physics. Let me explain.

As anyone who has taken time over the last couple of years to read my essay posts will have noticed, I have been enamored with philosophy of late. There are several reasons for this attraction. For one, as a retiree I have the good fortune of having the mental health and available time to engage in an academic exercise such as studying philosophy and, to the extent I can comprehend it, its companion and contemporary discipline, quantum mechanics. These topics have held my interest for many years, but while living the working life I could never devote the time and concentration required to make any lasting sense of these subjects.

Beyond simply having the time and a casual interest in philosophy and physics, I am drawn to these areas of study for several other reasons. I believe I am not yet too old to use this knowledge as a guide to living a more eudaimonic, or flourishing, life with the years I have left. Additionally, as I conduct a life review and reflect on all I have lived and experienced, this study helps me to better understand why things are as they have been and why I have engaged with this life as I have.

Finally, I want to prepare myself for what is next after this life. Unlike devout religious people, I have not relied on a prescriptive belief of a hereafter. However, now knowing more concretely that death is more impending than ever before I want to have some comfort in knowing what to expect. In short, I want to have faith in a likely scenario for what will happen to me after I take my final breath.

My informal examination of philosophy and physics began, and continued for some while, on separate tracks. However, over time I began to see that the two disciplines overlapped in ways I had not expected. Philosophy, while not a social science, is certainly not a hard, quantifiable science either. It is too broad, too deep, and too subjective to be considered a science. It is a field of study uniquely its own.

Physics, or more specifically quantum physics, or quantum mechanics as it is more popularly known, is indeed a hard science, characterized by objectivity, procedural rigor, and precision. So, where is the connection between philosophy and quantum mechanics? It lies in a shared concern with the fundamental and foundational, which I will attempt to explain.

To better understand this conjunction of philosophy and quantum mechanics, it helps to know that philosophy and science grew in tandem, emerging jointly from the speculations of ancient Greeks who were attempting to explain the core nature of the world in which they lived. Aristotle (384 BCE-322 BCE) can be credited with giving science an early and consequential springboard. He differed from his teacher Plato (c.429 BCE-347 BCE) in some key ways. According to Aristotle, Plato was too steeped in logically determined metaphysical underpinnings rooted in abstractness. Plato’s philosophical constructs were too perfectly defined and objectively certain for Aristotle.

Rather, Aristotle found it more desirable and necessary to focus on the full range of tangible worldly materials and the way they changed, developed, decayed, and behaved. He thought knowledge should spring from a deep scrutiny of the explicit substances available to us. Hence, what became a western-styled science was given permission to exist.

Science as we know it today did not have a single founder. Neither did philosophy. But when we look back through history to trace the origins of the eventual merger of philosophy and science, we inevitably come again to Aristotle. In the fourth century BCE he was philosophically influenced and inculcated by two of the most prominent thinkers of the ancient world, Socrates (c.470 BCE-399 BCE) and Aristotle’s own tutor, Plato. With that philosophical grounding he went on to expand his understanding of the natural world through the practice of what became the essentials of scientific inquiry.

To begin with, Aristotle was intensely curious. This mattered, because curiosity is the launch pad for examination, creativity, problem solving, social progress, and personal development. Aristotle directed his curiosity toward the systematization of two very human capabilities, observation and reasoning. Focused empiricism and self-guided reasoning together formed the foundation for scientific investigation still in practice to this day.

Among the areas to which Aristotle applied his empirical and reasoning method were what we today call biology, ecology, and physics. It was in these sciences, including physics, which for centuries was known as natural philosophy, that he formalized the practice of disaggregating and classifying the natural world into discrete categories, principally causation, the elements, motion, and teleology (purpose-driven goals). Although many of his specific predictions did not stand up to the scrutiny of time, Aristotle’s three-way utilization of careful examination, logical reasoning, and classification nevertheless set the stage for the development of today’s scientific method.

Notice that Aristotle was drawn to a process which tried to base conclusions about the nature of reality by identifying and examining what he believed to be the constituent parts of reality. To better comprehend the totality of all there is, Aristotle determined it necessary to first apprehend the parts of all there is.

Aristotle was not the first of the ancient Greek philosophers to reach for a method we now call reductionism. A reductionist approach attempts to describe grand and intricate events and occurrences by breaking them down and viewing them through their elemental segments. Key pre-Socratic philosophers employed a similar technique.

Thales of Miletus (c.626 BCE-548 BCE) is another historic figure credited with early scientific thought. He proposed that everything in the known universe could be reduced to a single ingredient — water. Anaximenes (c.586 BCE-526 BCE) suggested that the fundamental element was air. And Heraclitus (c.535 BCE-475 BCE) offered that everything was derived from fire. This tendency to reduce the universe to its most basic workings has set the tone for how westerners contemplate and envisage all that there is since the beginning of recorded history.

As we see, the ancient Greeks set western science on a course of reductionism, which again can be simply explained as reducing complex circumstances or phenomena to basic and underlying components. To be sure, reductionism has driven science to comprehend a view of reality which has resulted in many remarkable discoveries. Through reductionism we have refined our ability to study phenomena, make predictions, and determine primal laws of nature.

As science, and in particular physics, matured, the individual whose approach and legacy most keenly exemplifies reductionism was the English scientist Isaac Newton (1643-1727). Whether in his works on the laws of motion and gravity, optics, fluid dynamics, or mathematics, Newton applied reductionist thinking to better understand the nature of reality and predict natural processes. His procedures and methods led to what has become the conventional manner of perceiving the known universe.

Newton’s influence on science and western thinking has been huge. Many of the services and products to emerge from applied science, which have had an immensely positive impact on humankind, can be credited to the profound influence of Newtonian schema and methodology. Historians claim that Newton’s contributions revolutionized science in disciplines ranging from astronomy to engineering, and that modern physics and mathematics owe much to his reductionist guidance. His analytic habit of viewing natural phenomena through essential principles and equations remains pervasive.

However, for all the gains reductionism has brought to our world, a myopic and restrictive perception of the universe has developed and hardened since the seventeenth century, such that conventional wisdom and commonplace thinking about the nature of reality is exceptionally mechanistic and based very heavily on rationalism.

Thanks to René Descartes (1596-1650), western thought took a sharp turn toward the advancement of reason, which allowed a skepticism to emerge not only about scholastic thought’s reliance on Plato and Aristotle, but also about the power of the Church to dictate enforced beliefs. Among the great consequences of Descartes’ life work was to extensively influence a novel and rational pattern of western philosophical thought, and by extension to induce a metaphysical and scientific view of the nature of reality that persists to this day. Descartes can be credited with establishing the revolutionary intellectual environment in which Newton could pursue his creation of the new physics.

Descartes was committed to discovering the most basic truths of reality, and did so by attempting to determine the most foundational aspects of knowledge, or epistemology. In essence, René Descartes applied reductionism in an attempt to unveil how we as humans could understand the fundamentals of reality. He skeptically stripped away all of his preconceptions and premises about the world to search for the one incontrovertible truth marking the starting point for the thoughtful and aware self. “Cogito, ergo sum” became that truth: “I think, therefore I am.”

The reductionist epistemological method used by Descartes contributed greatly to validating reductionism as a technique applicable to comprehending all that there is. As science developed into a set of disciplines in the years following Descartes, we see reductionism widely used as a process for peering into the nature of complex systems. Indeed, it is the approach of perceiving complex systems through reductionism that both helps and hinders our understanding of natural occurrences, especially when it comes to science.

As mentioned earlier, there can be setbacks to relying on reductionism to reveal answers to the mysteries of existence. To better understand I will begin by noting that there is no more complicated and elaborate structure than the universe. Reductionism attempts to simplify this vast complexity by identifying individual elements, which the thinking goes, combine to make the whole. As we are learning over time, intricate systems such as the universe involve more than parts. They also manifest qualities and processes that cannot be captured through an inventory of components alone.

For example, let’s look at consciousness, the phenomenon expressing our subjectivity and our sense of self. Is consciousness really just a result of brain action, as in synapses among neurons, or is there a more holistic, non-quantifiable, and universally fundamental process at play resulting in our experiential mindfulness? I would say, yes, the latter is very possible. Consciousness manifests as too miraculous and too illuminating to be merely an outcome of the conduct of matter. Reductionism is too austere a method for explaining the richness of consciousness.

Synergy is a term referencing a type of alchemy: a force or efficiency achieved within a complex system when its constituent parts interact such that the system’s overall effectiveness is measurably greater than the sum of the individual parts. How does this happen? It is counterintuitive and contrary to what basic arithmetic tells us. Synergy is a way of saying two plus two equals five. Something magical appears to happen when components interact cooperatively, resulting in the whole being more than all of the portions added together.

Reductionism misses synergy in its calculation. There are existing and emergent properties not easily discerned by mathematics and science, properties that have their origin in a generative spirit the ancient Stoics referred to as Logos (more on Logos later). Systems, no matter how complex they may be, are part of larger systems. Sequestering easily identifiable components can miss the valuable interactions in force. Interconnectedness and constant exchange inject dynamism and vitality into systems. Reductionism does not always do a good job of accounting for such interoperability.

Reductionism can be seen as a practice stemming from a belief in materialism or physicalism. Materialism asserts that all of reality is composed of physical materials. This view rejects any notion of a separate spiritual reality or of a distinct existence related solely to mental states or consciousness. All that there is can be explained by physical substances and the laws governing their actions. In this type of reality reductionism makes sense. Just keep slicing and dicing until you get to the most essential particles of existence.

So why, you might ask, this criticism of reductionism, and by extension materialism, a methodology that has been practiced for centuries and has contributed immeasurably to the betterment of humankind? Reductionism deserves the credit given to it above in reference to Isaac Newton. However, the past hundred years have brought a grand, paradigm-changing revelation highlighting the limits of reductionism. This relatively newly learned lesson comes in the form of quantum mechanics.

As science has continued to dig deeper and peer ever further into the material universe it has run into a roadblock of sorts. Strange things are occurring at the quantum level of reality — so strange that what we have thought for centuries about materialism and its character is now being reassessed. The behaviors and processes of matter and energy at this level upend our understanding of the known universe.

Quantum mechanics is the most recent approach to the study of modern physics, which began in the 1920s. It is the study of the most fundamental behavior of matter and energy at atomic and subatomic scales. Physicists appear to have largely run out of runway when it comes to discovering the next smallest particle. But of special note is the fact that we enter a bizarro world at the quantum level, one that seems to question the linear and sequential order of things we have been accustomed to.

Let us take a look at some key examples of the conditions driving this quantum game-changing view of reality.

Our journey into counterintuition best begins with a look at wave-particle duality. As best physicists can tell, the most fundamental material entities discernible are particles, the most commonly known of which are photons (light) and electrons and protons (both subatomic). What is noticeable is that these and other quantum particles exhibit both particle-like traits and wave-like traits. For example, a key particle-like property is discreteness, in which a quantized energy level or value is detectable. An example of a wave-like property is the wavefunction, a mathematical expression whose squared magnitude gives the probability of finding a particle at a given location in space. One is left questioning: is the most basic constituent a particle or a wave, or are they somehow unified?

Wave-particle duality leaves a novice student such as myself thinking that everything, including matter, is energy. I see no reason to date to think otherwise. The other consideration of note is that wave-particle duality is a good starting point for learning about the other unique and odd discoveries of quantum mechanics and of the most fundamental particle entities (also known as quanta). It is safe to say that classical physics, such as Newtonian mechanics, electrodynamics, thermodynamics, and optics, ceases to be applicable at the quantum level of physics.

Superposition really rocks the world of how we thought things were. This term describes particles being in multiple states at the same time. It is only during an attempt by an observer to measure the state or position of a particle that the “wavefunction collapses” into just one state or position from among the many it could have been in. It is like Jim being in Moscow and New York at the same time; not until an observer intentionally spots Jim in Moscow can we say that Moscow is where Jim is located at that moment.

Entanglement is just as irrational. In this phenomenon, two particles can be entangled, or influenced by one another, sometimes at great distances, i.e. nonlocality. The condition of one particle can be instantaneously affected by the other even at separations where there should be a time lag due to the immense expanse of space between them. Einstein remarked incredulously that such an occurrence was “spooky action at a distance,” since an information exchange appeared to be occurring between the two particles faster than the speed of light.

As you can see quantum measurement is a tough thing to nail down. How to measure key features of quantum entities remains a controversial and debated issue. When the very act of attempting measurement appears to affect the nature of the entity being measured how can one know its state in unobserved reality? In fact, one can wonder, is there such a thing as an unobserved reality?

As we are seeing, it makes sense that a central tenet of quantum mechanics is known as the Uncertainty Principle. At the beginning of the quantum age in the 1920s, German physicist Werner Heisenberg (1901-1976) concluded that the position and momentum of a particle cannot both be measured with precision at the same time; the more accurately one is known, the less accurately the other can be. In the nearly one hundred years since Heisenberg proclaimed the Uncertainty Principle it remains a valid concept. Once we dive deep enough into the quantum realm, reality takes on a whole new meaning — one that is hard to wrap our minds around.

So what am I to apprehend from this convergence of philosophy and physics? What are my takeaways at this point in my understanding of this information and why should they matter? Does any of this change my perceptions to the degree that I think of the world differently than I did before? To begin answering these questions I will note what conclusions or beliefs I have from the above descriptions.

At heart, nature is my guide. I believe there is a natural process to the universe, an unfolding always occurring. Science and philosophy are lenses through which to view nature and from which to infer the basics of reality. Learning from nature is not as easy as simple observation, however. Our senses give us direct experience with reality, but they also limit the insights we can derive from nature. Something more than sensorial experience is needed. We humans are capable of integrating a non-sensorial dominion into our imaginations that can complement our rational comprehension of all that there is.

I have faith in Logos, the generative spirit introduced to us by the ancient Stoics, as my gateway to the non-sensorial realm. It is what produced the Big Bang and establishes the entire order/disorder of the universe. Logos is ubiquitous and present from the grandest structures in the universe to the quantum level. The expression of all physical and mental states have at their essence Logos. I believe this spirit is what is meant by a belief in God.

Dutch philosopher Baruch Spinoza (1632-1677) advocated for a notion of pantheism, the idea that God is in everything. He taught there does not exist a transcendent God separated from the creation or nature. Pantheism, and its modern secular counterpart panpsychism, captures what feels right to me. Logos is my starting point and the place to which I frequently return for apprehending reality.

Beyond a faith in Logos I face a significant challenge. It appears I am searching for certainty in an uncertain universe. Quantum mechanics tells me that we cannot be sure about much, if anything, at the core of reality. Nature is not boundlessly dissoluble. We can only slice and dice or reduce just so far. A point in reductionism arrives when we enter a province that is unpredictable, random, contradictory, and contingent.

Christopher Bader, principal investigator for the annual Chapman Survey of American Fears, notes that the many fears Americans share can be traced back to uncertainty. There is a self-help expression encouraging us to embrace uncertainty. Yes, accepting chance as more likely than conviction is a key upshot for me and perhaps it should be for others as well. This encourages me to be more agile in my thinking and less definite in the conclusions I draw.

The brain, of course, can do many things. Among them is a capacity to harness chance and possibility. It is also stochastic in how it operates, meaning the brain can be inherently random in how it processes inputs. Many of the results the brain yields are not predictable and grounded in certitude, but rather are presented as probabilities and statistical distributions. This may explain our ability to be creative and to have novel ideas.

As Dartmouth College neuroscientist Peter Tse suggests, because the brain functions such that new configurations of thought and conception are possible, this is likely an indication that the universe performs this way also. A reasonable hypothesis is that quantum processes with all of their stochasticity are manifesting in our brains and reflecting the workings of the universe at large.

One of the great controversies and mysteries of both philosophy and science historically concerns the question of whether the universe is deterministic or indeterministic. If deterministic, then all that has occurred since the Big Bang is preordained and destined to happen like the sequence of events in a movie. There are no interventions which can change destiny. All has already been programmed.

On the other hand, indeterminacy allows for capriciousness and irregularity, in other words the very randomness quantum mechanics indicates is commonplace. Change, process, and uncertainty are innate and part of the fabric of our reality. Learning to not resist this primary aspect of our universe seems wise.

I will finish with this observation. Of course our lives have a lot of predictability despite all of the evidence suggesting otherwise. The sun rises in the morning and sets in the evening. Spring still follows winter every year. Death ends lifetimes. The past has occurred. The future is yet to be experienced. And the present is ever fleeting. Life is not easily understood and the more we try to make sense of it the more questions are generated. That said, the intersection of philosophy and physics is a fascinating place to be. There is much more to learn. I imagine I will continue to visit this place often.


The Mixed Story of Women in the Workforce

First to the good news for women in the workforce. Women in America are enrolled in greater numbers in higher education than men. According to the National Center for Education Statistics, during the fall of 2021 female students comprised 61% of the higher ed student body with men at 39%. A year earlier the stats were 58% female and 42% male. Projections for 2030 indicate that there will be 2.37 million more women in postsecondary institutions than men. The trend is clear. Women are more drawn to improving their levels of education compared to men.

This was not always the case. In 1970, male enrollment outnumbered female enrollment. By 1980, admissions were at parity. And now here we are. This shift should tip the balance of education’s benefits toward women more than men.

What are these benefits? Even at a time such as ours when the high cost of college education is causing more people to question its return on investment, there are still documented advantages to getting an undergraduate degree. These include:

  • Higher earning potential and incomes
  • More employment possibilities
  • Increased job security
  • More abundant compensation packages
  • Enhanced personal development
  • Greater networking opportunities
  • Improved job satisfaction

It is not a stretch to predict that these merits will eventually give women the edge in business leadership and economic clout. Whether a feminization of the economic picture will be an overall gain has yet to be seen. Will competition be improved by defanging it to some degree, or at least softening its sharpness? Again, that remains to be seen.

However, the outlook for women’s employment writ large is not so rosy across the board. Among the policy areas the think tank Third Way examines is the non-college economy, and its dive into the numbers provided by the Bureau of Labor Statistics (BLS) reveals a troubling forecast for women who do not pursue a college education.

Estimates are that over the next decade the rate of job loss among non-college women is expected to increase significantly. Unfortunately, many non-college women are currently employed in industries that are predicted to decline.

So, let’s look at the big picture. BLS has identified the industries projected to decline economically and by extension employment-wise over the next ten years. Among these industries, 97% of the positions not requiring a college degree will disappear. Incidentally, 60% of these job losses are now middle-wage jobs. And here is the kicker. Two-thirds of these precarious jobs are currently held by women without a college degree!

Most non-college women work in jobs considered low-wage or middle-wage. Examples of low-wage jobs include cashiers and fast-food cooks; middle-wage jobs include office clerks and retail sales supervisors. Historically, middle-wage jobs provided the means for women to support themselves and get established in the middle class. With many of these jobs facing elimination, the strain on non-college women to afford middle-class lifestyles will become more pronounced.

To add insult to injury, it is these middle-wage jobs that are most likely to be abolished, even when compared to the low-wage jobs. In fact, low-wage jobs, those under $36,700, are under less threat according to BLS than the middle-wage jobs. If true, it becomes easy to see that a migration of non-college women from middle-wage to low-wage work is likely.

The decline of middle-wage jobs is largely being caused by automation and outsourcing. And who knows to what extent Artificial Intelligence will exacerbate this movement? Examples of middle-wage jobs include:

  • Administrative assistants outside of legal, medical, and executive
  • Customer service representatives
  • Assemblers and fabricators
  • Bookkeeping, auditing, and accountant clerks
  • Frontline office supervisors

One possibility for averting this disturbing development is to hope that the proliferation of industry credentials, certificates, and badges that qualify women (and men) for middle-wage positions without the need for college degrees will continue. Although such credentialing will not replace college degrees, in the short term it may stem the tide of disappearing middle-wage jobs.

Another thought is that the college-educated women who will have more decision-making authority in the future will design economic and employment solutions for the women who have been unable to go to college. My fingers are crossed.

When Considering an Encore Career

I recently attended a high school reunion. This was not the typical high school reunion, which is attended only by alumni from your graduating year. I attended a private all-male boarding school in the Berkshires of western Massachusetts, which operated from 1926 until it closed in 1971.

So, reunions for this school include any surviving alumni from any year during the time the school was open. This most recent 2023 reunion included alumni ranging from the graduating year of 1948 until 1971.

As you can imagine, nearly all of the attendees are now retired from their careers. But not everyone. As I chatted with a number of alumni I found that among those not fully retired there were two distinct categories of workers.

There were those who continued working at their primary careers, but at a reduced or dialed-down level, meaning they were not putting in the same amount of time or handling the same degree of stress as when they were full-time employees.

Then there were those who desired to continue working, but at some type of work which was either very different or tangentially related to their former employment. This latter category is sometimes referred to as an encore career.

One of the great benefits of both our current labor force and our prolonged healthy lives relative to previous generations is that we have an option of pursuing an encore career. Establishing one, however, brings a new set of challenges that an older individual needs to be prepared to confront.

Just because you present yourself as an experienced and reliable resource with a long track record of accomplishments does not mean you will automatically be seen as a shoo-in for the new gig. In fact, the case most often seems to be that your age decreases your chances of being accepted. This requires that initiating an encore career be done systematically and attentively.

To begin with, do not shy away from being old; instead, embrace it and spin your advanced age as a positive. You have gained a lot of work experience, solved many problems, and built an in-depth skillset.

Emphasizing your general tenacity, dependability, and trustworthiness can go a long way toward gaining stakeholder and customer trust, which in many cases is as important as, or more critical than, expertise alone. People who will need your services, or who will want to join you in delivering services, want the comfort of someone they can rely on. Gaining that trust early on is crucial.

Another key to attaining trust is to highlight connections between your past successes and what you are promising to deliver in your new role. There will be overlaps in type, quality, or circumstances linking accomplishments previously achieved with intended future benefits you propose to supply.

One way to identify and credibly discuss these junctures is to prepare responses to some of the toughest questions you could get in an interview, or from prospective customers during a vetting process. If needed, enlist trusted contacts skilled at playing the skeptic, forcing you to justify your claims.

Through rehearsal, anticipate the concerns of others whose trust and support you will need to succeed in your encore career, and heighten your authenticity by articulating how your past performance has prepared you for future challenges.

Also, over the course of your long career you have hopefully cultivated and maintained work-related relationships that span generations. Being able to call on younger professionals who can vouch for your excellence can go a long way in polishing your new brand.

Show others that you are not just a monument to legacy ways of operating, but that your instincts and inclination are toward continuous learning and improvements with an attitude of welcoming new problems to solve. Demonstrate how you are still passionate about the work you want to do, even at this late stage in life.

Career Passion and Wellbeing

It is a conventional understanding, sometimes expressed explicitly but often simply assumed, that if we are to work for a living, then our efforts yield richer rewards if we have a genuine passion for our career choice. Passion, we are told, is the greatest motivator. It is what compels us to willingly throw ourselves into our work and to perform at our finest with no external stimuli needed.

“Do what you love and you’ll never work a day in your life,” goes the maxim. What could be better than to feel such fervor for your career?

It is easy to see how management would be delighted to have workers who are “naturals” at fulfilling functions necessary for the prosperous advancement of a business or organization. Such employees will require few if any performance incentives. They are self-motivated players who embrace being subject matter experts. In their hands, productivity should reign without the problems associated with someone displaying less ardor for their work.

Workers who view their careers as vocations rather than as jobs are a precious resource for any enterprise. Managers who realize this will do what they can to facilitate conditions designed to enhance employee wellbeing and sustain the valuable assets they have. Conversely, managers who see very dedicated employees as a never-ending supply of production, and who develop an attitude that these workers will always go the extra mile because going above and beyond is inherent in them, could very well find they have squandered an advantage.

Even for those for whom their work is their calling, respect and care must be regularly demonstrated by management if this talent is going to remain committed to the organization and to do their best work. The results of a research study on the topic of wellbeing released by Gallup, Inc. in July 2023 reveal pertinent findings that leaders should know if they are serious about holding onto their best and brightest.

To start, Gallup finds that only one in four workers think their employers are concerned about their wellbeing. This is true in the U.S., U.K., Germany, and France. The abysmally low number is historic as well. Except for a brief period at the beginning of the pandemic, when many workers believed management cared about their health and welfare, roughly 25% has been the norm.

It is simply good business for management to genuinely support their best workers. To quote from the authors of the Gallup research, employees who believe management is dedicated to their wellbeing are:

  • Three times more likely to be engaged at work
  • 69% less likely to actively search for a new job
  • 71% less likely to report experiencing a lot of burnout
  • Five times more likely to strongly advocate for their company as a place to work
  • Five times more likely to strongly agree that they trust the leadership of their organization
  • 36% more likely to be thriving in their overall lives

These powerful statistics strongly suggest that structuring a workplace so that all employees, and in particular the most valuable talent, are emotionally and substantively gratified goes well beyond being a nice thing to do; it actively advances the fulfillment of an organization’s mission.

Wellbeing in general involves not just career, but other social, financial, and health related factors. And of course, it is ultimately up to each individual to shape their lives so they are living optimally. However, given the amount of time and energy careers require, this is an area of life demanding special attention. Wellbeing should be a fundamental organizational issue as well as a personal responsibility.

Even the best employees deserve to know they are truly valued. Operating as if it is entirely up to each person to independently find fulfillment on the job leaves the workplace vulnerable to low productivity and weak competitiveness.

Innovative Journeys: Unlocking Opportunities for Creative Recognition

The world of creativity is a realm brimming with boundless imagination and innovation, yet the journey to being discovered often proves challenging. Emerging artists, fashion designers, and makers confront the daunting task of standing out amidst a sea of talent. Fortunately, there are strategies that can illuminate the path to recognition. Leslie Campos of Well Parents unveils seven proven paths that creative minds can tread to claim their spot in the spotlight.

Why an Online Presence Is Essential for Creatives

In the digital age, an online presence acts as a creative’s virtual storefront, open 24/7 to the world. A captivating website serves as a central hub where admirers can explore their portfolios and learn about their journeys. Social media platforms amplify their reach, enabling direct engagement with an audience that resonates with their artistry. A curated online portfolio showcases their work’s evolution, providing a snapshot of their creativity and growth.

The Importance of Networking for Creatives

Networking is a cornerstone of success in the creative industry, and industry events serve as fertile grounds for making connections. Conferences, trade shows, and exhibitions allow creatives to interact with peers, mentors, and potential collaborators. Showcasing their work at these events positions them in front of a relevant audience, leading to exposure that can propel their career forward.

How Competitions Elevate a Creative’s Profile

Competitions aren’t just about winning trophies; they’re gateways to recognition. Participating in renowned competitions places a creative’s work on a prestigious pedestal, catching the eye of judges, fellow artists, and industry experts. Even if they don’t claim the top prize, the exposure garnered from competing can lead to invaluable opportunities, solidifying their position within the creative landscape.

The Power of Collaboration in Fueling Mutual Creative Success

Collaboration is a potent catalyst for creative success. By teaming up with fellow artists, designers, or makers, creatives can tap into new perspectives, skill sets, and audiences. Collective efforts amplify their reach and introduce their work to fresh admirers. Collaborative projects also nurture a sense of community and mutual support, reinforcing their position within the creative ecosystem.

How Community Events Can Propel Creative Careers

While the digital realm is expansive, local exposure should not be underestimated. Participating in pop-up shops, art fairs, and community events fosters connections with neighbors and fellow creatives. This local support serves as a foundation upon which national and international recognition can be built. The media attention garnered from such events often serves as a springboard for further exposure.

The Role of Workshops and Courses in Enhancing Creative Skills

A creative mind is a constantly evolving entity. Engaging in industry-specific workshops and courses refines skills and expands professional networks. Learning from established experts and sharing experiences with peers fuels personal growth. The insights gained from workshops empower creatives to adapt to evolving trends and challenges, ensuring they remain at the forefront of their field.

Reaching the Right Audience at the Right Time

Technology is a beacon of opportunity for creative professionals. By embracing customer data management systems, creatives can personalize their interactions with admirers, enhancing engagement and loyalty. Informed decisions based on data insights streamline marketing efforts, ensuring their work reaches the right audience at the right time. This technological approach revolutionizes how creatives connect with their supporters.

The path to creative recognition is a multifaceted journey, but with these seven proven paths, emerging artists, designers, and makers can pave their way to the spotlight. By crafting a powerful online presence, participating in industry events, entering competitions, embracing collaboration, seeking local exposure, expanding horizons through learning, and leveraging technology, creative minds can rise above the noise and shine brightly. As they embrace these strategies, they empower themselves to confidently navigate the creative landscape, ready to claim their well-deserved place in the limelight.


LinkedIn Has a Profile Problem

Those of us who use LinkedIn often have noticed a troubling trend recently: an increasing number of profiles, those displays which serve as de facto online resumes, are fake. Profiles that at first glance look legitimate turn out not to be at all. This took me by surprise when I first realized it was going on. Why bother creating fake LinkedIn profiles? What is to be gained? Well, scams are the reason, and it is keeping LinkedIn busy. Salaria Sales Solutions reveals that over the first half of 2021 LinkedIn deleted 15 million of these fake profiles.

Personally, I have noticed them as a result of questionable connection invitations. One kind I see most commonly involves a profile picture of a beautiful young woman who holds an impressive job title at a well-known corporation, law firm, or entrepreneurial venture that seems almost too good to be true. The first few times I saw some of these I felt the need to scratch a curiosity itch, so I conducted some research to see if the person really held that job. I could never confirm that they did. Hmm.

The other kind I see most often are profiles coming from African countries like Ghana, Kenya, Gambia, and Uganda. Now, I like the idea of connecting with real workers from Africa and learning about the employment and economic challenges they face. But this is not what happens. Sooner or later the pitch for money comes.

It turns out that scammers, who are notorious for knowing how to fish where the fish are, realize that the valid users of LinkedIn, of which there are approximately 900 million, make up a relatively high-income user base. That alone incentivizes them to attempt exploitation. The scams are varied. A common one is phishing, which is a con to get you to reveal personal data. Aura.com reports LinkedIn phishing attacks have jumped 232% since February 2022! Related to phishing are employment scams in which fake job listings are posted. “Recruiters” ask for personal data like bank account or Social Security numbers and bang, they’ve got you! They disappear and you are left critically exposed.

Catfishing is a romance scam in which the scammer tries to emotionally hook you into an online relationship leading eventually to a money request. Another one to look out for is the crypto investment scam. Here victims are persuaded by a crypto-savvy-sounding con to invest in cryptocurrencies using an authentic-looking exchange, only to find that their “earnings” are just digital numbers on a screen that cannot be withdrawn while their initial investment sits in the scammer’s pockets.

Although LinkedIn is just the latest platform to be tainted by scams, this raises the question of how genuine online connections in general are. In our increasingly digitized and remotely connected lives, both personally and at work, we find ourselves called upon to establish relationships with individuals we may never meet in the flesh. Can we ever know that the person we are communicating with is really who they claim to be?

Clearly, establishing and confirming online connections has become a new skill to master. Knowing how to avoid scammers is important enough, but basic social ethics compels us to want to know that the people we are communicating with are real. We need to be wary of deception in our online dealings. Unfortunately, this means we need to either learn how to conduct background checks or find services that can perform them for us. Perhaps this is an area where AI can become useful.

What a world! Loneliness is at epidemic levels and social media gives us the ability to connect with more people than we ever could in the physical world. And now we need to be concerned not only that people we communicate with are actually who they claim to be, but whether or not they mean to do us harm. Yes, what a world!


The Data Dilemma

Data-driven decision making has been all the rage in business for some time now. Data collected on what are determined to be a business’s key performance indicators is credited with producing higher-quality organizational decisions, which in turn drive the formation and execution of business strategy.

Data aggregation can be crucial for tasks ranging from answering difficult questions to more sophisticated functions such as developing and testing hypotheses and formulating theories. Further benefits of having vital information available include risk evaluation, resource allocation, program and policy assessment, and performance measurement. The use of data is seen as a more efficient means of carrying out these activities than relying on intuition, hunches, and observation alone.

It is hard to argue against the utilization of data in running a business. McKinsey reports that data-driven organizations are twenty-three times more likely to attract customers, six times more likely to retain those customers, and nineteen times more likely to be profitable.

It is not difficult to find examples of how data-driven decision making has led to business success. Anmol Sachdeva, an independent marketing consultant, has researched several. For instance, Red Roof Inn positions its hotels near airports and figured out how flight cancellation data and weather reports could be combined to increase bookings. Netflix compiles user data concerning watch time, location, and programming preferences to predict which shows will become big hits. Coca-Cola leverages social media data to determine who mentions or posts pictures of its products. With this data, personalized and targeted advertising has led to a fourfold increase in clickthrough rates.
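To make the Red Roof Inn example concrete, the kind of data join it describes can be sketched in a few lines of Python. Everything here, the field names, the figures, and the cancellation threshold, is a hypothetical illustration of the general technique, not the company’s actual system:

```python
# Illustrative sketch: combine per-airport flight-cancellation counts with
# weather alerts to decide where stranded travelers are likely to need a
# nearby hotel room. Data and threshold are invented for the example.

cancellations = {"ATL": 120, "ORD": 45, "DFW": 8}          # cancelled flights today
weather_alerts = {"ATL": "snow", "ORD": "storm", "DFW": None}

def promotion_targets(cancellations, weather_alerts, min_cancelled=50):
    """Return airports with heavy cancellations AND an active weather alert,
    i.e., places where last-minute hotel promotions are most likely to convert."""
    targets = []
    for airport, cancelled in cancellations.items():
        if cancelled >= min_cancelled and weather_alerts.get(airport):
            targets.append(airport)
    return targets

print(promotion_targets(cancellations, weather_alerts))  # ['ATL']
```

The point is not the code itself but the pattern: neither dataset alone predicts demand, while the join of the two does.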

In short, data collection has become the proverbial game-changer for business. By helping organizations pinpoint the factors that better address challenges and boost productivity and profits, data and its astute analysis, is now an essential component of business success.

However, despite the advantages of focused data, there can come a point at which the zeal to collect information becomes extreme and intrusive, particularly for employees. Of course, it is reasonable to expect that management would want data to make improvements in workforce productivity. Performance metrics can be used to spot shortcomings, training needs, and specifics for employee performance evaluations. An organization that leans toward a results-only performance model for its employees needs objective data more than a manager’s potentially skewed or biased interpretation of how employees discharge their duties.

Quantification is misused when it goes beyond the reasons stated above. There are now too many documented instances of employees being monitored so excessively that the workplace has become a surveillance culture. A career cannot thrive in a context where someone is always looking over your shoulder. Questionable monitoring may be the result of management wanting to identify organizing threats such as unionization communications. Or maybe the surveillance is used to spot ways of automating tasks so as to reduce the workforce. In extreme cases, data may be applied to limit wage growth, exploit labor, or even discriminate.

Amazon may be the poster child for such data fanaticism. Brishen Rogers, an associate professor at Temple University’s Beasley School of Law, notes how in 2020 Amazon sought to hire two “intelligence analysts” who were to use data analytics and other means to find “labor organizing threats” within the Amazon workforce. The company also insists that its outsourced delivery providers hand over drivers’ geolocation, speed, and movement data to use however it wants. Inordinate corporate surveillance has also been chronicled at Uber, Lyft, Tesla, and Apple.

Inappropriate use of data results in loss of privacy, greater stress, and increased pressure on workers. The workplace can become a place of distrust and fear, not an environment conducive to innovation, high morale, and career enhancement. Instead, let’s insist that data collection be ethically grounded, transparent, and legally justifiable.

 

The Need for Versatile Leaders

There is no shortage of disruptions to our workplaces and careers. They come in two forms, one transient and the other sustained. There are the short-lived perturbations, for example our current experiences with inflation, Covid, the war in Ukraine, and spotty supply chain shortages. Then there are the disturbances rooted in recent history that continually transform, such as the evolution of globalization and technology, including the advent of generative AI. Taken as a whole, it can seem as if there is little time for complacency or slow-tempo work.

Managers seem especially exposed to the fluctuations and inconsistencies of the modern workplace. They are called upon to guide direct reports through turbulence and insecurity while attempting to follow strategic policies. This can be quite challenging. The way leaders handle threats and turmoil matters for the health of their own careers and of the careers of the workers affected by their approach to volatility.

Versatile leaders have been identified as valuable resources for a workplace to have during times of upheaval. They can be beneficial when the need arises to manage resources efficiently to remain productive. Maintaining employee engagement and adaptability during periods of uncertainty requires a special kind of leader. Organizations are increasingly aware of how important it is to have versatile leaders. 

Rob Kaiser of Kaiser Leadership Solutions and Ryne Sherman and Robert Hogan, both of Hogan Assessment Solutions, have been studying versatility in leadership for twenty-six years. They note how from the late nineties to the mid-2000s, co-worker ratings identified versatility as an important leadership trait 35% of the time. By the time of the Great Recession in 2008, versatility was seen as a significant leadership attribute in 50% of the ratings. And by the time of the pandemic, it shot to 63%. The demand for versatile leadership is growing in recognition. With the rate of change accelerating as it is, it is easy to see why.

Kaiser et al. define versatility as the leadership ability to function effectively in a context characterized as volatile, uncertain, complex, and ambiguous. Within that setting, versatile leaders can quickly adapt by applying a range of appropriate skills and behaviors that reshuffle and redeploy resources to preserve productivity. This type of leadership manifests in two distinct ways. One style is more forceful and direct, as in a single point of command tasked with making the hard choices. The other approach reaches out to employees in an empowering and supportive way to provide tranquility and to ease concerns. The skilled practitioner of versatility knows how to shift between these modes as situations dictate.

In fact, a leader who may be well versed and experienced in one of these modes, but unable to adroitly shift to the other, does not qualify as a versatile leader and indeed may be a lower quality leader overall due to their situational limitations. However, the good news is that versatility can be an acquired capability. Counterintuitively, versatility does not correlate with any specific personality type; versatile leaders are represented across multiple personality types. Given that the research of Kaiser et al. identifies fewer than 10% of the leadership workforce as versatile, the incentive is there for increased versatility training.

Although personality alone may not be a strong predictor of versatility, other background elements are. It has been documented that having many kinds of work experiences, ones requiring the development of a diverse range of skills in circumstances for which the leader was not already highly qualified, serves as de facto versatility training. The more a leader faces stretch assignments, combined with an innate attitude that sees these duties as learning opportunities, the more versatility is enhanced. Potential leaders who want to be relevant in today’s world should take note.

An AI Bill of Rights

Often it is difficult to separate living from working. Our personal lives and professions can become intertwined such that it can seem pointless to differentiate those aspects which are personal from professional. Such is the case when considering one of today’s hottest topics, the impact of artificial intelligence. Is AI going to sway our lives in general or be mostly an employment issue? A fair prediction is that AI is going to change the landscapes of both our lives and of our work. 

As citizens and as workers we should have a strong say in what the influence of AI is going to be in our daily lives and on our jobs. The disruptive potential is too huge to leave AI development solely up to engineers and their corporate employers. If AI advancements are to be the result of free market innovation, then those of us who are future customers and recipients of its consequences should have the freedom to weigh in and heavily influence its maturation. 

A practical way to approach this challenge is through the lens of individual rights. Ever since the seventeenth-century philosopher John Locke proposed the existence of fundamental natural rights, such as life, liberty, and property, we westerners have organized our social, political, and economic institutions around the notion of personhood rights to both preserve and extend the enjoyment of our lives. We bestow upon ourselves the rights necessary to live fruitful lives free of destructive intrusion. Now is the time to apply these rights in the face of AI infiltration.

A useful place to ground a national debate about AI’s proliferation is with the Biden Administration’s White House Office of Science and Technology Policy’s proposal known as the Blueprint for an AI Bill of Rights (https://www.whitehouse.gov/ostp/ai-bill-of-rights/). This is a thoughtful approach to identifying the key areas of contention in the planning, application, and mobilization of AI-based automated systems. 

Five principles are presented as foundational to designating what constitutes an AI Bill of Rights. To summarize: 

Safe and Effective Systems: An AI system should undergo input and testing from various sources to ensure its ability to deliver value free from the risk of malicious or unintended consequences. Humane industry standards and protective measures should apply, including the power to shut down harmful applications. Data usage is to be transparent, necessary, and respectful of personal integrity. 

Algorithmic Discrimination Protections: The biases, inequities, and discriminatory practices of people should not migrate to automated systems. Indefensible digital treatment of people based on their individual differences is to be considered unjust. Legal protections of ordinary citizens and farsighted equity assessments of intended and unintended uses of systems should be crucial in the design and deployment of AI systems. 

Data Privacy: This concern has been with us since the advent of Web 2.0. People should have ownership and agency over their data. The right to privacy is strong among free and independent people. This should be reflected in the automated systems they use. Exercising consent and having the ability to opt in and out of these systems with no restrictions should be inherent in their development. 

Notice and Explanation: It should not take a computer science degree for ordinary users to understand what they are getting into with AI systems. Clear and unambiguous language that informs operators about system functionality, intent, outcomes, updates, and risks is to be considered basic.

Human Alternatives, Consideration, and Fallback: In short, when a user determines that an automated system has become too unwieldy or its functionality too untenable, then he or she should be able to have access to a real person to help them. No one should feel trapped within the confines of an all-powerful system they do not understand and cannot properly operate. 

These principles could become a friendly conversation starter. As citizens we need a simple tool to unify the discussion as we confront this significant challenge. This AI Bill of Rights could be it. 

Job Changing Considered

For most of us, careers are built from a series of job moves. Sure, there are those who begin a life of dedication to a particular vocation from which they never deviate. Some spend their entire careers as business founders and owners, while others spend an entire career employed with just one firm. For most of us, though, we construct our careers as a migration from one opportunity to another. This necessarily involves job switching, an exercise requiring dexterity and proficiency.

There is certainly incentive to switch jobs currently. An economist at Glassdoor, Daniel Zhao, has data from the Atlanta Federal Reserve showing that job switchers have realized 7.7% wage growth since November 2022 compared to 5.5% wage growth for those who have remained in their jobs. Also, as economist Adam Blandin of Vanderbilt University points out, there are about two job vacancies for every unemployed person. And many workers know from experience that job changes are one of the best ways to enhance not just pay, but career prospects. All told, it is a suitable time to consider a job switch.

There is risk in job hopping, however. Downsides can emerge when we find ourselves in a worse situation than the one we left. In general, pitfalls occur when the new job is less stellar than we anticipated. Another snag is when the new job is less stable, as in you find yourself more exposed to layoffs. Obviously, it is important to not stumble and face regret when transitioning from one job to another. Therefore, a job switch needs careful planning. Let’s look at some of the key points to consider.

Planning for change should be deliberate. It begins with a deconstruction of your current work performance and how you have worked in recent positions. This task analysis seeks to identify those aspects of your work which energize you, bring you feelings of success and accomplishment, and align with the production metrics of your employer or target market. Conversely, it should also reveal those work facets which drain you of energy, leave you feeling unfulfilled, and fail to consistently meet production expectations. Such an inventory can be converted to a plan which becomes your North Star when implementing the job shift.

Be targeted when pursuing new employment opportunities. Do your research of both the employers and the industry space they play in. Know how they fare in meeting market demand and fending off the competition. Of course, there is an assumption here that their industry is your industry and presumably you know the economic viability of your professional field. If you have not conducted a SWOT analysis in a while, now is the time to do so. Illuminate as best you can the Strengths, Weaknesses, Opportunities, and Threats inherent in your industry.

Examine potential future employers like a private investigator. Google and study company employee reviews, of which there are now many; reach out on LinkedIn to employees to get their take on what it is like to work there; and leverage your own professional network to get the inside scoop. When you get job interviews, ask about employee engagement, career growth prospects, employee turnover rates, and the performance review program, including the metrics they use. You are interviewing them as much as they are interviewing you.

Examine your decision-making style too. Reflectively challenge your assumptions. Assess where faulty decision making has led you astray in the past. As executive career coach Susan Peppercorn says, cognitive bias, the tendency to more readily accept information that matches your existing viewpoints, can impair quality decision making. Accept that claims made by a potential employer which sound good to you may carry hidden risks.

As they say, nothing ventured, nothing gained. But as you tread into the dicey, but conceivably rewarding world of job change, be as prepared as possible.