A recent study – “The Future of Jobs” – from the World Economic Forum (WEF) reveals that over half the global workforce will require significant up-skilling and re-skilling by 2024. This is unsurprising, given the rapid rate of technological and business change. Yet these findings should be read alongside Harvard Business Publishing’s “2018 State of Leadership Development” report, which found that 80% of both L&D and business leaders believe organisations need more innovation in their approach to talent development.
Mindful of the momentous – and rapid – changes in the workplace over recent years, driven largely by advancing technology, any self-respecting manager or business leader will be watching anxiously for the next “big thing”.
‘Fake news’ – probably the key phrase of 2017 – has re-emphasised the need for ‘authenticity’ at all levels of society, including the world of work. Yet authenticity isn’t always a desirable trait in a manager, especially if their natural behaviour is unpleasant. So says Stefan Stern, visiting professor of management practice at Cass Business School, business journalist and former Financial Times (FT) management columnist.
To be an effective leader requires many skills, found in different proportions depending on the sort of leader you are. Yet one skill area in which every effective leader – of whatever type – must excel is communication. The first step to improving anything about yourself, including your communication skills, is to take what you already do to the highest level.
Some 84% of senior professionals believe that learning and leadership development programmes have improved their business knowledge, competencies and confidence; 83% believe that these programmes are vital to achieving business goals; and 81% say that executive education and leadership development are more important than ever.
Issues with the Internet of Things

The internet – as we now know it – has been an increasingly key part of our lives since 1995. In those 20 or so years, humanity has changed from being interested in ‘mere’ hardware and software [...]
First suggested in 1999 and now a growing technological reality, the Internet of Things (IoT) is a system of interrelated computing devices, mechanical and digital machines, objects, animals or people, each of which has a unique identifier and the ability to transfer data over a network without requiring human-to-human or human-to-computer interaction.
There’s a tendency to do ‘just-in-case’ training rather than ‘just-in-time’ training. In other words, people are given access to learning materials in case they ever need what they’re being taught, rather than at the moment they actually need it.