Carl Schell

Channel Strategy Session: Dealers and Research and AI

Transparency and trust are vital in the artificial intelligence age

Feb 20, 2024 7:00:00 PM




Josh Britton, President and CEO of imageOne, asked me a few weeks ago if we could provide some data to support the overarching messages in his presentation to the board. He is one of my newer dealer friends, a leader who voices his opinion in an understated, meaningful way that I appreciate. Based in Michigan, the dealer has a managed print services (MPS) business that accounts for almost all of its approximately $25 million in annual revenue (imageOne is ramping up its software solutions arm). As you can imagine, the deck of stats and trends we assembled was long on print but also offered details in “beyond print” areas such as artificial intelligence (AI) and digital transformation (DX).


This wasn’t the first time a dealer has approached us with a research request—it’s just the most recent example, and it sparked a thought, albeit far from a profound one: Research is a huge part of what Keypoint Intelligence does, whether on the market side (InfoCenter) or at the product level (bliQ), and it’s also a huge part of what happens in the world. Every day. There’s academic research (any study from any institution), business research (inflated to “business intelligence” far too often), and consumer research (ever visited Amazon or YouTube when deciding on a product?). My favorite kind of research is getting lost on Wikipedia, educating myself on interests like Formula 1 or thrash metal.


With the proliferation of AI and the increasing use of ChatGPT, the authenticity of research has been challenged, including its analysis and thought leadership components. Skeptics will continue to question the credibility of research and the people behind it, sometimes putting both under fire (because where there’s smoke…and I’m sure the criticism will sometimes be rightfully deserved). Will accounts of plagiarism in academia rise in the era of AI? Does the definition of “plagiarism” need to be expanded? Gathering research online is convenient, and artificial intelligence will only make it more so, but I was very happy to read—during my research for this piece, ironically enough—that in-person research is making a comeback of sorts.



Academic research is funded through grants and donations. Business research is paid for by subscribers to a service, early adopters of a study, or on the back end by customers interested in the topic. But what of business research—ahem, intelligence—caught in the crosshairs of an AI invasion that dwarfs the Chitauri offensive in The Avengers? Barry Sheinkopf, a college professor and a trusted writing advisor of mine, told me last month that AI in higher education could cause more harm than good but will be a godsend in the business community. You can argue that it might turn out the other way around in the end, but my dear friend’s statement rings true to me.


There is much work to accomplish, though, even after clearing the first hurdle of learning how to properly utilize AI tools. The work I’m really talking about falls on us humans, not artificial intelligence: preserving the authenticity of research. It comes down to accountability, to not simply accepting what AI spits out. We must fact-check. We must use prompts to improve the output and then edit it. It doesn’t matter if you’re copying and pasting numbers or words: Due diligence is not optional. My colleagues in the Publishing Group, while striving to find more effective and efficient processes with AI, believe the priority in Keypoint’s journey should be establishing the guardrails: policies, rules, and regulations. After all, we are the ones responsible for cleaning content, laying it out, and posting (Endgame, as we know it).


“imageOne values thoughtful insights and healthy debate, which start with a foundation of understanding data-driven research,” Britton said. “It’s easy to be misguided by incorrect or manipulative research, and the results of that could be damaging—all the more reason why it’s important to have well-vetted sources that you bring into the room for these discussions. Because when you boil it down, this is simply a trust issue between AI, the people who use the technology, the research that’s done, and the customers who ultimately buy the research.”


Did I also mention that Josh Britton can close?


Browse through our Industry Reports page (latest reports only). Log in to the InfoCenter to view research, reports, and studies on subjects galore through our Workplace- and Production-based Advisory Services. Log in to bliQ for product-level data, research, and reports. If you’re not a subscriber, or you work for a dealer and want to connect, contact us for more info by clicking here.