Market: a volatile week with the Fed back in focus (rates coming down across the curve, good news for tech). The biggest “triple witching” options expiry happens today – so perhaps some more volatility before the end of the year.
Portfolio: we started building positions in Datadog and Snowflake this week – we’ve commented over the last few weeks on the greater stability in cloud consumption, and longer term we think AI workloads will be a significant tailwind for both.
There is one more letter still to come from us this year, which will include a look ahead to 2024.
Nvidia shot back at AMD – the GPU debate continues
Portfolio view: we own both Nvidia and AMD and see them as very likely to share a very large and growing market in AI GPUs (with in-house hyperscaler solutions suited to very specific internal workloads and unlikely to be adopted widely).
Intel “AI Everywhere” – all about on-device and edge
Portfolio view: That Intel appears to have all but given up on the cloud GPU space underlines Nvidia’s clear leadership. On the CPU opportunity, we still back AMD to be the winner in CPU workloads within AI, where it has a significant performance lead over Intel. We own both Nvidia and AMD (both of which make their chips at TSMC). We don’t hold Intel, though interestingly, a large part of Intel’s new Meteor Lake processor will also be made at TSMC.
Semicap wars
Portfolio view: we still don’t know when construction of the Intel and TSMC fabs in Germany will start, but it could still benefit the semicap equipment names’ order books this year (orders which we think have largely been taken out so far this year, given the uncertainty over timing).
Semis and geopolitics continue to be intertwined. We think we’re close to peak globalisation as it relates to chip manufacturing. That is part of what informs our positive view on semicap equipment – more localised fabs mean more equipment (which remains one of the drivers of our above-consensus view on the semicap space), even if it might make very little sense economically (chips from TSMC’s Arizona fab will cost ~40% more than those from its Hsinchu fabs).
Oracle and GPU supply – “Gold rush”
Portfolio view: Like all the hyperscalers, Oracle needs to invest in AI to keep up with competitors doing the same. We don’t own Oracle, but importantly for us, its capex will grow significantly next year. That is exactly what we’re expecting for all of the hyperscalers (Oracle is clearly the smallest in magnitude, but the supply constraints it is currently seeing will be reflected everywhere).
It’s important for our thesis on Nvidia, AMD and TSMC – cloud players need to spend on GPU compute capacity and AI features to maintain share, which ultimately means more high-end servers and more chips.
Software ticking along, but perhaps not meeting high AI-driven expectations
Portfolio view: The question for us in software and AI has been if and when AI starts to be a meaningful revenue generator for software companies, rather than, as today, a cost of doing business. The bull case is that it creates more durable growth rates in software; the bear case is that it only hurts margins – all of these businesses will likely need either to increase capex to build out their own infrastructure, or to buy AI compute capacity from cloud providers, which we’ll likely see impact their gross margins.
The added variable for Adobe is its Figma acquisition. Adobe is still trying to get the deal through the US and European regulators and, while we hated the price paid, we think that without it Adobe might still be at risk of abstraction/commoditisation in an “AI agent” world.
We still think Microsoft will show the most meaningful and immediate revenue opportunity – some news this week that the Swedish municipality of Uddevalla (a town of 35,000 people) will spend $100k per year on Microsoft’s Copilot. Quite extraordinary.
Google loses its Epic battle
Portfolio view: We own Alphabet, and though this is certainly an additional datapoint to consider when we think about regulatory tail risk (for all of big tech), it’s not enough for us to meaningfully change our long-term return expectations, and we haven’t made any changes to our position in the portfolio.
Netflix and content arms dealers
Portfolio view: there are still plenty of bull and bear arguments out there for Netflix (we don’t own any streaming players, given we’ve struggled a bit with the long-term streaming trajectory) – but this release certainly adds credence to the bull case of Netflix as the single powerful buyer in a world where all the content owners turn into arms dealers (and perhaps ultimately shut down their own returns-diluting streaming businesses…).
Power-hungry AI drives the need for node transitions
Portfolio view: there is a bear argument around chips that Moore’s Law is slowing and node transitions are becoming too costly to make sense – ASML’s High-NA tools cost ~€250m each. Does the leading edge become like Concorde: the best, but too expensive? The clear counter to this is that node transitions, however expensive, are a necessity given power consumption and power availability.
For enquiries, please contact:
Inge Heydorn, Partner, at inge.heydorn@gpbullhound.com
Jenny Hardy, Portfolio Manager, at jenny.hardy@gpbullhound.com
Nejla-Selma Salkovic, Analyst, at nejla-selma.salkovic@gpbullhound.com