The human touch, digital twins, and swiveling laptop screens




Andriy Onufriyenko/Getty Images

Welcome to ZDNET’s Innovation Index, which identifies the most innovative developments in tech from the past week and ranks the top four, based on votes from our panel of editors and experts. Our mission is to help you identify the trends that will have the biggest impact on the future. 

For once, a non-AI-centric technology — digital twins — is making waves in the job sphere. Speaking of which, research finds people still have something important to offer in the workplace. 

In first place this week is a study from Canva that determined human creativity will still have value at work in the age of AI. A sample of hiring managers, educators, and recent grads confirms that creativity — defined in the study as “the ability to use your imagination to express yourself or your ideas, solve problems, or create something new” — is essential to career success and can’t be replicated by AI (one point: humans). The survey findings add to a growing consensus that, at least in the short term, AI’s ability to automate lower-level tasks will not only put human creativity at a premium, but leave us with more time to devote to it, too. The main takeaway? Creativity can be taught, and early. From kindergarten to college, Canva’s study says curriculums should invest in nurturing student creativity to prepare the next generation of professionals. Here’s how.

Also: 1 in 3 workers are using AI multiple times a week – and they’re shouting about it

ZDNET Innovation Index


At #2 is how digital twins could revolutionize drug development. Digital twins, or computer simulations of an object or environment that update dynamically with real-world data, are a mainstay in engineering, but they have yet to make similar inroads in the life sciences — until now. By digitally replicating time- and resource-intensive processes like testing and clinical trials, early data suggests these twins can speed up drug development and keep costs down. Currently, a drug takes an average of 10 years and $3 billion to bring to market, only to fail 96% of the time. If advanced further, this data-led approach could dramatically accelerate drug development while reining in medical costs. As Unlearn AI founder Charles Fisher told ZDNET contributor Tiernan Ray, digital twins could “turn medicine into a predictive science” that optimizes treatments, which could theoretically pave the way for personalized medicine in the future.
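For readers who want a concrete picture of what “updates dynamically with data” means, here is a minimal, purely illustrative Python sketch of the idea: a virtual model that absorbs streaming measurements and can then be queried to simulate outcomes before committing real resources. The class name, smoothing rule, and dose-response formula are invented for illustration and do not reflect Unlearn AI’s actual models or any real drug-development system.

```python
from dataclasses import dataclass, field


@dataclass
class PatientTwin:
    """A toy digital twin of a patient, tracked by a single biomarker value."""
    biomarker: float = 0.0
    history: list = field(default_factory=list)

    def update(self, measurement: float) -> None:
        """Fold a new real-world measurement into the twin (simple smoothing)."""
        self.history.append(measurement)
        self.biomarker = 0.7 * self.biomarker + 0.3 * measurement

    def simulate_treatment(self, dose: float) -> float:
        """Predict the biomarker after a hypothetical dose, with no real trial."""
        return self.biomarker * (1.0 - min(dose * 0.05, 0.6))


if __name__ == "__main__":
    twin = PatientTwin()
    # Streaming measurements keep the twin current with its real-world counterpart.
    for reading in [12.0, 11.4, 13.1]:
        twin.update(reading)
    # Explore candidate doses in silico before any physical testing.
    for dose in (1.0, 5.0, 10.0):
        print(f"dose {dose}: predicted biomarker {twin.simulate_treatment(dose):.2f}")
```

The appeal, as the researchers describe it, is that this kind of virtual experimentation can stand in for some of the slow, expensive physical steps, though real systems rely on far richer models and validated clinical data.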

In third place is Intel, which launched the Core Ultra 200V, an impressive new series of nine processors. As ZDNET’s Cesar Cadenas noted, the chips are aimed at laptops for work and everyday use, but they’re anticipated to pack a punch in speed and longevity. Intel reported that laptops running the processors can get 14 hours of battery life on a single charge, beating out competitor Qualcomm’s Snapdragon X Elite. Keeping pace with the influx of AI-ready hardware across consumer electronics, Intel says the new chips go “beyond what people expect” from a lightweight laptop. We’ll have to see how the devices stack up against all the other IFA-debuted laptops our experts will test when they arrive later this month.

Closing out the week are digital twins, once again — specifically, how they could transform product development across nearly every industry with the help of extended reality (XR). So far, AR and VR haven’t fully lived up to entertainment expectations, nor have they been widely adopted. As our resident experts have said, the Apple Vision Pro needs work (though it does have accessibility applications). But in business and industrial environments, XR makes global collaboration, testing, and the rendering of complex designs cheaper and more efficient. Combined with digital twins, XR makes it possible to try out new operations before the physical machinery even exists, lowering risk and speeding up development cycles. From consumer tech to planes to skyscrapers, wider adoption of this dynamic duo could revolutionize manufacturing (and it’s not even AI).
