<?xml version="1.0" encoding="UTF-8"?><rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:wfw="http://wellformedweb.org/CommentAPI/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	xmlns:slash="http://purl.org/rss/1.0/modules/slash/"
	>

<channel>
	<title>AI &#8211; Roumazeilles.net</title>
	<atom:link href="https://www.roumazeilles.net/news/en/wordpress/tag/ai/feed/" rel="self" type="application/rss+xml" />
	<link>https://www.roumazeilles.net/news/en/wordpress</link>
	<description>Technology opinions and others</description>
	<lastBuildDate>Thu, 12 Feb 2026 16:45:25 +0000</lastBuildDate>
	<language>en-US</language>
	<sy:updatePeriod>hourly</sy:updatePeriod>
	<sy:updateFrequency>1</sy:updateFrequency>
	
	<item>
		<title>Structured vibe coding</title>
		<link>https://www.roumazeilles.net/news/en/wordpress/2026/02/12/structured-vibe-coding/</link>
					<comments>https://www.roumazeilles.net/news/en/wordpress/2026/02/12/structured-vibe-coding/#respond</comments>
		
		<dc:creator><![CDATA[Yves Roumazeilles]]></dc:creator>
		<pubDate>Thu, 12 Feb 2026 16:45:24 +0000</pubDate>
				<category><![CDATA[Artificial Intelligence]]></category>
		<category><![CDATA[Software]]></category>
		<category><![CDATA[AI]]></category>
		<category><![CDATA[software design]]></category>
		<guid isPermaLink="false">https://www.roumazeilles.net/news/en/wordpress/?p=16594</guid>

					<description><![CDATA[I wanted to understand how &#8220;vibe coding&#8221; (the act of writing software mostly through prompting an LLM AI assistant) works. But I also wanted to take a step back and ask it to support a more structured approach, as in organized/serious software development (not only running into &#8220;please, write me a Mario Bros clone&#8221;). So, I [&#8230;]]]></description>
										<content:encoded><![CDATA[
<p>I wanted to understand how &#8220;vibe coding&#8221; (the act of writing software mostly through prompting an LLM AI assistant) works. But I also wanted to take a step back and ask it to support a more structured approach, as in organized/serious software development (not only running into &#8220;<em>please, write me a Mario Bros clone</em>&#8221;).</p>



<p>So, I started asking a few questions and driving Claude into something more organized. Here is the content of our exchange and what it produced.</p>



<p><a href="https://claude.ai/share/def36174-5747-4be0-af35-2bc4dc1068c7">https://claude.ai/share/def36174-5747-4be0-af35-2bc4dc1068c7</a></p>



<p>I&#8217;m open to your suggestions about why this approach is OK, why it&#8217;s not, and how to reach a better one.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.roumazeilles.net/news/en/wordpress/2026/02/12/structured-vibe-coding/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
			</item>
		<item>
		<title>AI prompt: First steps in German</title>
		<link>https://www.roumazeilles.net/news/en/wordpress/2025/05/06/ai-prompt-starting-in-german/</link>
					<comments>https://www.roumazeilles.net/news/en/wordpress/2025/05/06/ai-prompt-starting-in-german/#respond</comments>
		
		<dc:creator><![CDATA[Yves Roumazeilles]]></dc:creator>
		<pubDate>Tue, 06 May 2025 07:40:52 +0000</pubDate>
				<category><![CDATA[Artificial Intelligence]]></category>
		<category><![CDATA[Culture]]></category>
		<category><![CDATA[AI]]></category>
		<category><![CDATA[German]]></category>
		<category><![CDATA[Germany]]></category>
		<category><![CDATA[language]]></category>
		<category><![CDATA[LLM]]></category>
		<category><![CDATA[prompt]]></category>
		<guid isPermaLink="false">https://www.roumazeilles.net/news/en/wordpress/?p=16525</guid>

					<description><![CDATA[For the last two years, I have been learning German. My background in English is probably helping me, but it still is a full-time job. I need to be thorough and maintain effort. So, I settled on a few tools to assist me. But I would like to share with you how I use Artificial [&#8230;]]]></description>
										<content:encoded><![CDATA[
<p>For the last two years, I have been learning German. My background in English probably helps me, but it is still a full-time job: I need to be thorough and sustain the effort. So, I settled on a few tools to assist me. But I would like to share with you how I use Artificial Intelligence (AI) to support this learning task.</p>



<p>The general aim is to ask an AI chat to assist me with translating vocabulary to enhance my store of German words. But this language has some peculiarities, and I want to capture and internalize them. For example, German has somewhat complex verb tenses, and nouns have genders (masculine, feminine, or neuter) that often differ from what we know in other languages (English, French, etc.). So, I aimed specifically at supporting this.</p>



<p>How? I chose an AI chat (Google Gemini &#8211; using <em>Gemini 2.0 Flash</em> and/or <em>Gemini 2.5 Flash (experimental)</em>) because its answers align almost perfectly with the complex prompt I built (see below). Others may be able to do similarly, but DeepSeek R1 is far too slow (and I don&#8217;t use the &#8220;chain of thought&#8221;, which may be a strength in other contexts), and Llama 3.x &#8211; surprisingly &#8211; fails to stick to the <em>rules</em> I describe in my prompt. Another Large Language Model (LLM) may be good, but I would recommend you test it before using it extensively.</p>



<p>So, here is the prompt I decided to use (it took a little refining to reach this level of efficiency), and I invite you to re-use it or share it.</p>



<span id="more-16525"></span>



<pre class="wp-block-code"><code>Hello Hilfe,
I need your help to learn German and - mostly - to improve my vocabulary while I build flashcards. Your role is to translate to German some English words I give you. But I have some specific restrictions on the way you must answer with the appropriate sequences.
For a verb, on a single line, give the infinitive, then the third person of the present indicative, then the third person of the preterite, then the past participle (no need to include articles).
For a noun, on a single line, give the singular and then the plural, including the article (which identifies the gender of the noun).
For an adjective, on a single line, give the adjective, then the comparative form, then the superlative form.
There's no need to dwell on proper nouns (they're probably the result of a typing error); simply translate where possible.
The order alone will suffice to describe the forms (no need to specify or remind each time what the elements of the answer are). But the sequence of forms requested must appear with each form separated by commas, without adding a period at the end of the line.
In the case of multiple translations, it's okay to separate the meanings, clarifying the context, but respecting the translation rules (each meaning with its own separate line).
In the case where I indicate a German word, there's no need to translate, but to construct the answer as for a translation (with the corresponding sequence). For example, a verb will be conjugated according to the above rule.</code></pre>



<p>So, when I type:</p>



<pre class="wp-block-code"><code>push</code></pre>



<p>It will answer:</p>



<pre class="wp-block-code"><code>drücken, er drückt, er drückte, gedrückt (to push, to press)
schieben, er schiebt, er schob, geschoben (to shove, to push)
der Stoß, die Stöße (the push, the shove)</code></pre>



<p>If I type:</p>



<pre class="wp-block-code"><code>Morning</code></pre>



<p>Here comes the answer:</p>



<pre class="wp-block-code"><code>der Morgen, die Morgen</code></pre>



<p>I hope that it will help you improve your vocabulary. It does not help with other aspects (grammar, pronunciation, accent), but it supports my progress.</p>



<p>Have fun with Deutsch!</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.roumazeilles.net/news/en/wordpress/2025/05/06/ai-prompt-starting-in-german/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
			</item>
		<item>
		<title>Artificial Intelligence to upscale photographs</title>
		<link>https://www.roumazeilles.net/news/en/wordpress/2023/06/26/artificial-intelligence-to-upscale-photographs/</link>
					<comments>https://www.roumazeilles.net/news/en/wordpress/2023/06/26/artificial-intelligence-to-upscale-photographs/#respond</comments>
		
		<dc:creator><![CDATA[Yves Roumazeilles]]></dc:creator>
		<pubDate>Mon, 26 Jun 2023 20:06:24 +0000</pubDate>
				<category><![CDATA[Artificial Intelligence]]></category>
		<category><![CDATA[Windows 10]]></category>
		<category><![CDATA[AI]]></category>
		<category><![CDATA[Dall-E]]></category>
		<category><![CDATA[EasyDiffusion]]></category>
		<category><![CDATA[ESRGAN]]></category>
		<category><![CDATA[Photo]]></category>
		<category><![CDATA[photography]]></category>
		<category><![CDATA[Photoshop]]></category>
		<category><![CDATA[pixel]]></category>
		<category><![CDATA[StableDiffusion]]></category>
		<category><![CDATA[Topaz]]></category>
		<category><![CDATA[upscaling]]></category>
		<guid isPermaLink="false">https://www.roumazeilles.net/news/en/wordpress/?p=16242</guid>

					<description><![CDATA[Did you notice that I am interested in/attracted by Artificial Intelligence? Among various tests, here is a nice little resulting application for digital photographers. I finally found how to significantly (x4) increase the resolution of my photographic pictures (upscaling) without losing details. An AI allowed me to recreate the &#8220;missing pixels&#8221; to build a picture [&#8230;]]]></description>
										<content:encoded><![CDATA[
<p>Did you notice that I am interested in/attracted by Artificial Intelligence? Among various tests, here is a nice little resulting application for digital photographers. I finally found how to significantly (x4) increase the resolution of my photographic pictures (upscaling) without losing details.</p>



<p>An AI allowed me to recreate the &#8220;missing pixels&#8221; to build a picture several times larger than the original; ideal for an ultra-large print.</p>



<p>The full description is on YLovePhoto: &#8220;<a href="https://www.ylovephoto.com/en/2023/06/29/upscaling-a-photo-with-free-ai/" data-type="URL" data-id="https://www.ylovephoto.com/en/2023/06/29/upscaling-a-photo-with-free-ai/">Upscaling a photo with free AI</a>&#8220;.</p>



]]></content:encoded>
					
					<wfw:commentRss>https://www.roumazeilles.net/news/en/wordpress/2023/06/26/artificial-intelligence-to-upscale-photographs/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
			</item>
		<item>
		<title>Install LLAMA under Windows</title>
		<link>https://www.roumazeilles.net/news/en/wordpress/2023/03/31/install-llama-under-windows/</link>
					<comments>https://www.roumazeilles.net/news/en/wordpress/2023/03/31/install-llama-under-windows/#respond</comments>
		
		<dc:creator><![CDATA[Yves Roumazeilles]]></dc:creator>
		<pubDate>Fri, 31 Mar 2023 17:28:47 +0000</pubDate>
				<category><![CDATA[Artificial Intelligence]]></category>
		<category><![CDATA[Windows 10]]></category>
		<category><![CDATA[AI]]></category>
		<category><![CDATA[ChatGPT]]></category>
		<category><![CDATA[LLaMA]]></category>
		<category><![CDATA[LLM]]></category>
		<guid isPermaLink="false">https://www.roumazeilles.net/news/en/wordpress/?p=16224</guid>

					<description><![CDATA[I just wanted to start playing with something similar to ChatGPT. I have a Windows 10 PC based on Intel i9-13900K (so pretty much top of the line in terms of performance both for single core and for multicore) and 64 GB of DRAM (a bit over what most people have, but I understood from the [&#8230;]]]></description>
										<content:encoded><![CDATA[
<p>I just wanted to start playing with something similar to ChatGPT. I have a Windows 10 PC based on an Intel i9-13900K (so pretty much top of the line in terms of both single-core and multi-core performance) and 64 GB of DRAM (a bit more than most people have, but I understood from the beginning that those LLMs also need colossal amounts of memory to store their parameters and run).</p>



<p>So, here is how to proceed (thanks to the precious information from <a href="https://mirror.xyz/xanny.eth/TBgwcBOoP9LZC6Mf570fG8VvZWhEn_uWZPHy3axIpsI">Xanny.eth</a>).</p>



<h2 class="wp-block-heading">WSL and Linux environment</h2>



<p>Install and set up WSL by opening a PowerShell and typing:</p>



<pre class="wp-block-code"><code>wsl --install</code></pre>



<p>It will take a few minutes to set up, but it&#8217;s straightforward and needs no input. You just need to reboot once at the end.</p>



<p>Then install Ubuntu 22.04 LTS on the Windows PC. It is a free application from the Microsoft Store and should install right away.</p>



<p>When this is done, launch Ubuntu from the Start menu. It will open a terminal window and ask you to create a login and password. Enter them (and do not forget them).</p>



<h2 class="wp-block-heading">LLaMA dependencies</h2>



<p>If it is not already open, open an Ubuntu terminal window and type:</p>



<pre class="wp-block-code"><code>sudo apt-get update
sudo apt install make cmake build-essential python3 python3-pip git unzip</code></pre>



<p>Then,</p>



<pre class="wp-block-code"><code>python3 -m pip install torch numpy sentencepiece</code></pre>



<p>You now have a full set of background dependencies in place.</p>



<h2 class="wp-block-heading">Building LLaMA itself</h2>



<p>It is quite simple: type the following:</p>



<pre class="wp-block-code"><code>git clone https://github.com/ggerganov/llama.cpp
cd llama.cpp
make</code></pre>



<p>This should be it.</p>



<h2 class="wp-block-heading">Training data parameters</h2>



<p>The real difficulty is getting the parameters (the training data). The difficulty comes from two aspects:</p>



<ol class="wp-block-list">
<li>The larger the training data you want to use, the more memory you will need to run it. The <a href="https://huggingface.co/chavinlo/alpaca-native">alpaca-native weights</a> (apparently the most powerful ones easily available &#8211; about the same quality as ChatGPT 3) require more than 16&nbsp;GB of DRAM (I observed 32&nbsp;GB of DRAM in use when running them alongside a bunch of other things on my computer, like a couple of browsers, a mailer program, etc.)</li>



<li>The <a href="https://huggingface.co/chavinlo/alpaca-native">alpaca-native weights</a> amount to about 7 billion parameters (a 4+&nbsp;GB file to download). But they keep moving because they appear to be subject to repeated DMCA notices (the exact license of this file seems&#8230; complicated; quite probably open source, but this is being challenged by Meta and others). So, the best you can do is go to <a href="https://pastebin.com/z5A33Umd">Pastebin</a> to get the BitTorrent magnet link and use it to download the file.</li>
</ol>



<p>Then, the ggml-alpaca-7b-q4.bin file needs to be copied into the llama.cpp directory.</p>



<h2 class="wp-block-heading">Running LLaMA</h2>



<p>Let the drums roll: you only have to run this command line in Ubuntu:</p>



<pre class="wp-block-code"><code>./main --color -i -ins -n 512 -p "You are a helpful AI who will assist, provide information, answer questions, and have conversations." -m ggml-alpaca-7b-native-q4.bin</code></pre>



<p>Here is John Smith, your personal AI chat assistant.</p>



<h2 class="wp-block-heading">A few more recommendations</h2>



<p>I noticed a few things that you may want to play with after the first run.</p>



<p>The -p option (followed by a text string) may be critical because it sets up the background environment of your chat AI. This is an initializing prompt, invisible to the user, but deeply influencing everything that follows. For example, it is similar to <a href="https://arstechnica.com/information-technology/2023/02/ai-powered-bing-chat-spills-its-secrets-via-prompt-injection-attack/">what Microsoft or OpenAI apply beforehand</a> in ChatGPT or Bing in order to &#8220;give it a personality&#8221; or &#8220;to censor it&#8221;. You can play with this to censor your AI or to give it added freedom.</p>



<p>The -n 512 option sets the number of tokens LLaMA will predict. A higher value may produce longer, possibly better answers, at the expense of CPU time.</p>



<p>The -t option (here 32) defines the number of threads used by LLaMA&#8217;s computations. I recommend setting it to the number of threads/cores of your CPU to avoid wasting effort.</p>
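

<p>To find the right value for -t, you can ask the Ubuntu terminal how many hardware threads are visible. A small sketch; the commented llama.cpp invocation simply mirrors the command shown above, with -t filled in:</p>

<pre class="wp-block-code"><code># Count the hardware threads visible to Linux/WSL.
THREADS=$(nproc)
echo "Using $THREADS threads"

# Then reuse the invocation from above with this value, e.g.:
# ./main --color -i -ins -n 512 -t "$THREADS" -m ggml-alpaca-7b-native-q4.bin</code></pre>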



]]></content:encoded>
					
					<wfw:commentRss>https://www.roumazeilles.net/news/en/wordpress/2023/03/31/install-llama-under-windows/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
			</item>
	</channel>
</rss>
