<?xml version="1.0" encoding="UTF-8"?><rss xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:atom="http://www.w3.org/2005/Atom" version="2.0" xmlns:itunes="http://www.itunes.com/dtds/podcast-1.0.dtd" xmlns:googleplay="http://www.google.com/schemas/play-podcasts/1.0"><channel><title><![CDATA[ClearSignals]]></title><description><![CDATA[Cutting through the noise with curiosity about where AI is taking us — and what we lose or gain along the way.]]></description><link>https://ai.clearsignals.ca</link><image><url>https://substackcdn.com/image/fetch/$s_!HSaY!,w_256,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa1160688-8f3f-48f1-b0ee-32548a441112_1024x1024.png</url><title>ClearSignals</title><link>https://ai.clearsignals.ca</link></image><generator>Substack</generator><lastBuildDate>Sun, 12 Apr 2026 15:36:27 GMT</lastBuildDate><atom:link href="https://ai.clearsignals.ca/feed" rel="self" type="application/rss+xml"/><copyright><![CDATA[ClearSignals]]></copyright><language><![CDATA[en]]></language><webMaster><![CDATA[Info@clearsignals.ca]]></webMaster><itunes:owner><itunes:email><![CDATA[Info@clearsignals.ca]]></itunes:email><itunes:name><![CDATA[ClearSignals]]></itunes:name></itunes:owner><itunes:author><![CDATA[ClearSignals]]></itunes:author><googleplay:owner><![CDATA[Info@clearsignals.ca]]></googleplay:owner><googleplay:email><![CDATA[Info@clearsignals.ca]]></googleplay:email><googleplay:author><![CDATA[ClearSignals]]></googleplay:author><itunes:block><![CDATA[Yes]]></itunes:block><item><title><![CDATA[Am I Learning, or Is It an Illusion?]]></title><description><![CDATA[How close can AI get you?]]></description><link>https://ai.clearsignals.ca/p/am-i-learning-or-is-it-an-illusion</link><guid isPermaLink="false">https://ai.clearsignals.ca/p/am-i-learning-or-is-it-an-illusion</guid><dc:creator><![CDATA[ClearSignals]]></dc:creator><pubDate>Tue, 24 Mar 2026 04:05:27 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!4VPo!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F76b38760-f476-4534-910d-12e5ba127397_3392x1696.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!4VPo!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F76b38760-f476-4534-910d-12e5ba127397_3392x1696.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!4VPo!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F76b38760-f476-4534-910d-12e5ba127397_3392x1696.png 424w, https://substackcdn.com/image/fetch/$s_!4VPo!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F76b38760-f476-4534-910d-12e5ba127397_3392x1696.png 848w, https://substackcdn.com/image/fetch/$s_!4VPo!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F76b38760-f476-4534-910d-12e5ba127397_3392x1696.png 1272w, 
https://substackcdn.com/image/fetch/$s_!4VPo!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F76b38760-f476-4534-910d-12e5ba127397_3392x1696.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!4VPo!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F76b38760-f476-4534-910d-12e5ba127397_3392x1696.png" width="728" height="364" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/76b38760-f476-4534-910d-12e5ba127397_3392x1696.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:false,&quot;imageSize&quot;:&quot;normal&quot;,&quot;height&quot;:728,&quot;width&quot;:1456,&quot;resizeWidth&quot;:728,&quot;bytes&quot;:6485812,&quot;alt&quot;:&quot;The AI Wall&quot;,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:&quot;https://ai.clearsignals.ca/i/191214813?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F76b38760-f476-4534-910d-12e5ba127397_3392x1696.png&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:&quot;center&quot;,&quot;offset&quot;:false}" class="sizing-normal" alt="The AI Wall" title="The AI Wall" srcset="https://substackcdn.com/image/fetch/$s_!4VPo!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F76b38760-f476-4534-910d-12e5ba127397_3392x1696.png 424w, https://substackcdn.com/image/fetch/$s_!4VPo!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F76b38760-f476-4534-910d-12e5ba127397_3392x1696.png 848w, https://substackcdn.com/image/fetch/$s_!4VPo!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F76b38760-f476-4534-910d-12e5ba127397_3392x1696.png 1272w, https://substackcdn.com/image/fetch/$s_!4VPo!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F76b38760-f476-4534-910d-12e5ba127397_3392x1696.png 1456w" sizes="100vw" fetchpriority="high"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 
9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a><figcaption class="image-caption"></figcaption></figure></div><h3>Source: Gen AI Won&#8217;t Make Your Employees Experts</h3><p><a href="https://hbr.org/2026/03/gen-ai-wont-make-your-employees-experts">Published by Harvard Business Review in March 2026</a> </p><div><hr></div><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://ai.clearsignals.ca/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe now&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://ai.clearsignals.ca/subscribe?"><span>Subscribe now</span></a></p><div><hr></div><p>The title of a recent HBR piece had me stop and think: <em><strong>Gen AI Won't Make Your Employees Experts.</strong></em> Whoa... What!? Why? Frankly, I've been using Gen AI tools for about a year and I feel as though I've been learning a lot more quickly these past 12 months. I won't summarize the entire article for you, but I do recommend that you give it a quick read.</p><p>I'll cut to the chase... the article is built around a study from researchers at Stanford University and Harvard Business School's Digital Data Design Institute. It's a working paper, so not yet peer-reviewed, but well-designed. The study introduces what researchers call "the AI wall" &#8212; the hard limit on how much generative AI can help people complete tasks outside their area of expertise.</p><div class="pullquote"><p><strong>Q: What makes someone an &#8220;expert,&#8221; anyway?</strong></p></div><p>Basically, the study is showing that success is dependent on the distance you are from the expertise you&#8217;re engaging in. That&#8217;s an interesting concept. As I&#8217;ve said before, I&#8217;m no professional writer, and I don&#8217;t have a desire to be one, but I do have a desire to get my ideas organized enough for you to understand them.</p><div class="pullquote"><p><strong>Q: If I&#8217;m using Gen AI to help organize my thoughts and write them into an article cohesively, am I not learning how to write effectively in the process?</strong></p></div><p>Well, I can see where learning could be encouraged and where it might not, especially if you just accept what&#8217;s generated for you without thinking about it, or challenging it. If you&#8217;ve come across the recent conversation about <a href="https://hbr.org/2025/09/ai-generated-workslop-is-destroying-productivity">&#8220;AI workslop,&#8221;</a> you&#8217;re probably already connecting some dots here. I&#8217;m filing that one away for later.</p><p>ClearSignals is partly a result of Gen AI usage. I&#8217;m asking questions about AI technology research without much formal training or experience in this style of writing. The tool is taking my unstructured thoughts, organizing them, and showing me how the mechanics need to apply to achieve my goal. The result feels like an improved version of what I&#8217;ve drafted. Here&#8217;s where my experience starts to diverge from what the study measured. I&#8217;ve created a feedback loop: I draft something, the tool restructures it, I ask why it made the changes, and I try to do better next time. I&#8217;m getting better each time, right? 
Or am I?</p><div class="pullquote"><p><em><strong>Q: Am I learning, or is the tool giving me the illusion that I am?</strong></em></p></div><p>Oh, no&#8230; is Gen AI creating a crutch that I can&#8217;t live without anymore? If I&#8217;m truly learning something, I should be able to apply what I&#8217;ve learned without the tool. The working paper behind this article examined how the partnership enhances expertise, not what happens when it&#8217;s taken away. That thought sits with me for a minute.</p><p>I keep coming back to something, though. We&#8217;ve always invented tools to advance our capabilities. Is this really any different?</p><div><hr></div><p>Walking through the evidence might help me think this through. The article explains that foundational knowledge is required: you need enough domain knowledge to tell whether the generated responses are good or bad. The study tested three groups by asking them to write a web article, a task normally done by expert writers. When conceptualizing the writing task, all groups scored closely together and much higher than without Gen AI. But when actually completing the writing, the results split. The expert writers improved their score with Gen AI. The marketing specialists, who were closer to the writing domain, nearly matched the experts. The technology specialists, who were furthest from the expertise, barely improved at all. They couldn&#8217;t tell the good from the bad, so naturally they hit the wall.</p><div class="pullquote"><p><em><strong>Q: How do we break down the wall?</strong></em></p></div><p>So, the non-experts didn&#8217;t completely fail. In fact, the marketers scored almost as well as the expert writers with the use of AI, and the study attributes this to their being closer to the expertise than the technology specialists were. The technology specialists, who were most distant from the expertise, saw almost no change at all.
Here&#8217;s what it looked like.</p><div class="captioned-image-container"><figure><img src="https://substackcdn.com/image/fetch/$s_!9r9V!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff58f97e2-7654-4df8-99c8-6eed8a450065_750x334.png" width="750" height="334" alt=""></figure></div>
<div class="pullquote"><p><em><strong>Q: What if we just keep at it and use AI more, instead of less?</strong></em></p></div><p>That question speaks to my nature a little. I've never been one to leave well enough alone. If you tell me not to do something, I'm pretty much going to do it anyway. That instinct has served me well sometimes and gotten me in trouble other times. But I like to think of it as challenging the norm to seek something better. Is the intuitive path actually the incorrect path? What if we used these AI tools like an exercise machine? Could we keep using them with a keen mind and healthy skepticism and question what's being put in front of us?</p><div class="pullquote"><p><em><strong>Q: If we kept pushing against the wall, would it eventually move?</strong></em></p></div><p>One thing that keeps pulling at me is what happened to the group furthest from the expertise. The technologists didn&#8217;t just struggle with the writing. They actively discarded parts of the generated content that they should have kept, and kept parts they should have thrown out. They didn&#8217;t know what they didn&#8217;t know. That&#8217;s the wall.</p><p>But here&#8217;s what I keep coming back to. When you don&#8217;t know what to ask, why not just ask the tool what you should be asking? Wouldn&#8217;t it give you some ideas?</p><p>Maybe. But that assumes you know you&#8217;re missing something. The technologists in this study weren&#8217;t lost. They were confident. They edited the AI&#8217;s output through the lens of their own expertise, and for their domain, they were right. For this task, they were wrong. They weren&#8217;t ignoring the tool. They were overriding it with expertise that didn&#8217;t apply.</p><p>But if the technologists had been shown their test scores, would they have asked why they didn&#8217;t improve? I&#8217;m confident that I would have. Receiving feedback that I had discarded the correct information would bother me enough to adjust. Remember, the working paper wasn&#8217;t designed to measure whether repeated practice would change the outcome, so we won&#8217;t know whether they would have. But the question that won&#8217;t stop repeating in my head is: what if they did?</p><div class="pullquote"><p><em><strong>Q: What if we used Gen AI as collaborators to check our thinking?</strong></em></p></div>
<p>The technologists used Gen AI to generate content. They didn&#8217;t use it to challenge their own assumptions. Is this the difference between AI workslop and finding the value in Gen AI tools? It&#8217;s a completely different kind of partnership, and it&#8217;s one the study didn&#8217;t test. Is it helpful to have this kind of thinking partner with you all the time, and does it help you learn more quickly?</p><div><hr></div><p>I started this article because an HBR title caught my eye and made me a little defensive. <em><strong>Gen AI Won't Make Your Employees Experts.</strong></em> My gut reaction was to push back, because my own experience felt like evidence to the contrary.</p><p>But after sitting with the research, I&#8217;m less sure. The study shows that the wall is real for people who are far from the expertise. And it shows that even people close to the expertise need foundational knowledge to make the partnership work. I think I have that foundation when it comes to understanding AI and technology. But when it comes to writing? I&#8217;m honestly not sure where I stand.</p><p>This article was written with Gen AI. And I think I&#8217;ll continue to get better as I keep using it. But I&#8217;m also the one judging that, which brings me right back to where I started.</p><div class="pullquote"><p><em><strong>Q: Am I learning, or is it an illusion?</strong></em></p></div><p>I still don&#8217;t know. But I think it&#8217;s still worth running towards the technology to find out rather than running the other way.</p><p>-Anthony /CS</p><div><hr></div><p style="text-align: center;">Something still sitting with you?</p><p style="text-align: center;">The best questions don't arrive with answers.</p><p style="text-align: center;">If one followed you out of this article, let me know about it.</p><p class="button-wrapper"><a class="button primary" href="https://ai.clearsignals.ca/survey/6570376"><span>Leave me a question</span></a></p>]]></content:encoded></item>
<item><title><![CDATA[Why I Started Asking Questions About AI...]]></title>
<description><![CDATA[And Why I Can&#8217;t Stop...]]></description><link>https://ai.clearsignals.ca/p/why-i-started-asking-questions-about</link><guid isPermaLink="false">https://ai.clearsignals.ca/p/why-i-started-asking-questions-about</guid><dc:creator><![CDATA[ClearSignals]]></dc:creator><pubDate>Sun, 15 Mar 2026 05:51:59 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!pCAq!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fedb76d5d-58a3-422f-952e-57e3e9f4e614_1456x400.jpeg" length="0" type="image/jpeg"/>
<content:encoded><![CDATA[<div class="captioned-image-container"><figure><img src="https://substackcdn.com/image/fetch/$s_!pCAq!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fedb76d5d-58a3-422f-952e-57e3e9f4e614_1456x400.jpeg" width="728" height="200" alt=""></figure></div>
<p>I think it&#8217;s safe to say I&#8217;ve been fascinated by technology for my entire life. I was interested in computers from the moment I got my hands on one. As of today, I&#8217;ve spent more than twenty years working in technology. I studied it, built a career in it, and it enabled me to access my own capabilities. I&#8217;ve genuinely loved how it made me feel. My core belief about technology has always been simple: it enables people to be better at whatever they engage in.</p><p>About a year ago, I started noticing something in the people closest to me. Friends. Family. People I&#8217;ve known for years. When AI came up in conversation &#8212; and it was coming up everywhere &#8212; I kept hearing the same themes: fear. Skepticism. Rejection.</p><p>I was surprised and a little confused. Then, honestly, I was frustrated.</p><p>Here I was, someone who had built their identity around technology and its potential, listening to people I care about recoil from something I saw as fundamentally hopeful. I'd think: don't we want to cure cancer more quickly? My instinct was to engage in conversation and debate the other side. To make the case. To prove that this was another step forward in a long line of steps that had already delivered things we never thought possible in our lifetime.</p><p>So I started paying closer attention.</p><p>And the more I paid attention, the more complicated it got.</p><p>I saw things that reinforced my optimism. I also saw incidents that gave me pause. I watched some of the people who built these systems start to publicly raise concerns. Not fringe voices, but researchers and founders who had spent their careers on this technology. I watched the discourse swing wildly between utopia and apocalypse, often within the same news cycle.</p><p>I realized I wasn&#8217;t frustrated with the people in my circle anymore. I was confused myself.</p><p>I&#8217;ll be honest about something: I came into this wanting to prove a point. The evidence wouldn&#8217;t cooperate. That turned out to be more interesting.</p><p>That confusion is what led me here.</p><div><hr></div><p>The more I paid attention, the more I noticed that the conversation itself had a problem. Whether someone was excited, terrified, or just exhausted by the topic, most people &#8212; myself included, at first &#8212; were more invested in defending a position than in actually examining one. That realization is what changed my approach. Instead of trying to be right, I decided to try to be honest about what I actually didn&#8217;t know.</p><p>So I&#8217;m trying to stop endlessly conversing about this and build something different &#8212; a place where I can follow the evidence honestly, sit with uncertainty, and keep asking questions past the point where most conversations stop.</p><div><hr></div><p>I&#8217;m not a researcher or an academic. And I&#8217;m certainly not a journalist.
I&#8217;m a technology professional who has spent two decades helping organizations understand and apply technology to create success. I know how to ask hard questions, push past the surface narrative, and look for what's actually true. While I&#8217;m also interested in how AI does what it does under the hood, there are many far more brilliant technologists exploring those topics with great success. The area of AI that I feel I&#8217;m best positioned to explore and contribute to is the impact it has on people. You, me, and all of our loved ones. The most important people.</p><div><hr></div><p>ClearSignals is my attempt to do that.</p><p>The name comes from the problem. There&#8217;s an enormous amount of noise in the AI conversation &#8212; hype, fear, marketing dressed as insight, opinion dressed as evidence. I wanted a place to cut through it. To look at what we actually know, what remains genuinely uncertain, and what questions are worth sitting with even when they don&#8217;t have clean answers.</p><p>The focus here isn&#8217;t AI as a tool. It&#8217;s AI as a force &#8212; one that&#8217;s already touching how we work, how we think, how institutions function, and how we understand ourselves as human beings. I want to examine that force seriously, without pretending I have all the answers, and without necessarily trying to prove anything. That means asking what this does to how we work, how we think, and what we believe we&#8217;re capable of &#8212; not just what it does to our economics.</p><p>You don&#8217;t need a technical background to read this. You do need an open mind and some patience for nuance.</p><div><hr></div><p>A few things worth knowing about how I work:</p><p>If you&#8217;ve read this far, you already know I have a passion for and curiosity about technology. So what would an AI writing project written by a person who loves technology be if I didn&#8217;t use AI to help me write? I want to be transparent about that, because this type of writing has always been a weak spot for me. I&#8217;ll let you decide whether I&#8217;m succeeding at doing it in a passable way, but one thing is for certain &#8212; an AI tool has given me the confidence to do this.</p><p>But this is exactly the tension I want to explore. The ideas here are mine. The questions come from my own genuine confusion and curiosity. What&#8217;s happening is this: I express my thoughts, then use an AI tool to take those raw thoughts and focus them &#8212; put the bloody comma in the right place (maybe) &#8212; and surface a note or two that lets me come through more clearly for you.</p><p>Whether that&#8217;s augmentation or something more complicated &#8212; that&#8217;s one of the questions we&#8217;ll wrestle with here together.</p><div><hr></div><p>The posts here will range &#8212; some examining specific findings, others grappling with value systems, some raising questions and leaving them open. I&#8217;m following the evidence and the curiosity, trying not to land anywhere too specific.
The hope is that we navigate this together.</p><p>If that sounds like something worth following, come along for the ride.</p><p>&#8212; Anthony /cs</p>]]></content:encoded></item></channel></rss>