<?xml version="1.0" encoding="UTF-8"?><rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:wfw="http://wellformedweb.org/CommentAPI/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	xmlns:slash="http://purl.org/rss/1.0/modules/slash/"
	>

<channel>
	<title>GUI &#8211; Xojo Programming Blog</title>
	<atom:link href="https://blog.xojo.com/tag/gui/feed/" rel="self" type="application/rss+xml" />
	<link>https://blog.xojo.com</link>
	<description>Blog about the Xojo programming language and IDE</description>
	<lastBuildDate>Tue, 30 Jan 2018 15:17:28 +0000</lastBuildDate>
	<language>en-US</language>
	<sy:updatePeriod>hourly</sy:updatePeriod>
	<sy:updateFrequency>1</sy:updateFrequency>
	<generator>https://wordpress.org/?v=6.9.4</generator>
	<item>
		<title>Supporting Multiple Cores</title>
		<link>https://blog.xojo.com/2018/01/25/supporting-multiple-cores/</link>
		
		<dc:creator><![CDATA[Norman Palardy]]></dc:creator>
		<pubDate>Thu, 25 Jan 2018 20:13:00 +0000</pubDate>
				<category><![CDATA[Tips]]></category>
		<category><![CDATA[Console]]></category>
		<category><![CDATA[CPU]]></category>
		<category><![CDATA[GUI]]></category>
		<category><![CDATA[Threads]]></category>
		<guid isPermaLink="false">https://blog.xojo.com/?p=3832</guid>

					<description><![CDATA[With today's multi-core CPUs, an application made with Xojo running on a single core can seem somewhat restricting. If you have a lot of data to process, large images to manipulate or other work that could happen in the background, it would seem that with a multi-core machine you could do this faster "if only Xojo would make threads preemptive". If you think you need preemptive threads today, try the helper application approach and I think you'll be pleasantly surprised at how well it works.]]></description>
										<content:encoded><![CDATA[<p>With today&#8217;s multi-core CPUs, an application made with Xojo running on a single core can seem somewhat restricting. If you have a lot of data to process, large images to manipulate or other work that could happen in the background, it would seem that with a multi-core machine you could do this faster &#8220;if only Xojo would make threads preemptive&#8221;. We get a lot of requests for preemptive threads so that people can take advantage of multiple cores.</p>
<p><span id="more-3832"></span></p>
<p>It&#8217;s been suggested that this should be <em>easy</em> to do: just make threads preemptive (so that they can run on any available core) and voila! Unfortunately, it&#8217;s not that simple. Let&#8217;s look at why the current threads are not preemptive. Preemptive threads are hard for you, the developer, to work with. Some languages, like Java, have features built into them to try to make this less work, but it is still hard to get right and very hard to debug when you don&#8217;t. Much of the Xojo framework would need updating to be thread safe, and your application&#8217;s user interface is not thread safe on any platform because the operating systems themselves don&#8217;t have thread-safe user interface code. If you access something from a preemptive thread and that something is not thread safe, there&#8217;s a very good chance your application is going to crash. We have had to do a lot of extra work just to make the cooperative threads you have today work without causing problems.</p>
<p>The Xojo language already provides synchronization classes such as Mutex and Semaphore that help you write a multi-threaded program. But you would have to protect every place you might set a value that could be shared by many threads. That means protecting any globals and pretty much anything that is not local to the method or the thread being executed. That&#8217;s very different from what you have to do today and a lot more work. It&#8217;s just not <em>simple</em> or <em>easy to use</em> the way most of Xojo is designed to be.</p>
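<p>To make that concrete, here is a minimal, hypothetical sketch (in Python rather than Xojo, purely as a language-neutral illustration) of the discipline preemptive threads impose: every shared value must be guarded by a lock at every point it is touched.</p>

```python
import threading

# counter is shared by all threads, so every access must hold counter_lock.
counter = 0
counter_lock = threading.Lock()

def add_to_counter(n):
    global counter
    # Forget this lock in even one place and you have a data race.
    with counter_lock:
        counter += n

threads = [threading.Thread(target=add_to_counter, args=(1,)) for _ in range(100)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(counter)  # 100
```

<p>Now imagine applying that discipline to every global and every shared object across an entire framework.</p>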
<p>The end goal is to use all cores and thereby make your software more responsive, faster, able to handle more data, or able to do more things at the same time. There has been a way to do this for as long as Xojo has supported building console applications: create a main application (GUI or not) and use helper applications to run separate tasks. The main application and the helpers can communicate in any one of several ways: IPCSockets, TCPSockets, UDPSockets, XML, files, or just about any other way you can dream up. The upside to this approach is that you can design, implement and debug the main application and the helpers independently, using the debugger on each. You can use any data in either program the same way you always have. You don&#8217;t have to worry about the framework being thread safe because the helper and the main application run as completely separate processes, each with its own memory space. Most importantly, you can do this today.</p>
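<p>As a rough, hypothetical sketch of the pattern (again in Python purely for illustration), a main process can launch a helper as a completely separate process and exchange work with it over a pipe; a Xojo main application would do the same with a console helper over an IPCSocket, TCPSocket, or files.</p>

```python
import subprocess
import sys

# Hypothetical helper: a separate process that squares each number it
# reads on stdin. It stands in for a Xojo console helper application.
HELPER_CODE = r"""
import sys
for line in sys.stdin:
    print(int(line) ** 2)
"""

helper = subprocess.Popen(
    [sys.executable, "-c", HELPER_CODE],
    stdin=subprocess.PIPE,
    stdout=subprocess.PIPE,
    text=True,
)

# The main application hands work to the helper and reads back results.
# Each process has its own memory space, so no locks are needed.
out, _ = helper.communicate("\n".join(str(n) for n in [2, 3, 4]))
results = [int(line) for line in out.split()]
print(results)  # [4, 9, 16]
```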
<p>I&#8217;m not going to say it&#8217;s simple. You have to think about which portion of your application can be segmented out into a helper console application. You have to design the communication between the main and helper applications. You have to write, test and debug them. But you don&#8217;t have to worry about variables changing mysteriously because some other thread changed a value. You don&#8217;t have to use lots of mutexes or semaphores to block other threads from altering things when you least expect it. And you can use the entire framework that is available to console applications. Last but not least, you can run as many instances of these helper applications as your computer (or all the computers available to you) can handle.</p>
<p>If you think you need preemptive threads today, try the helper application approach and I think you&#8217;ll be pleasantly surprised at how well it works.</p>
<p>For more information: <a href="https://blog.xojo.com/2013/07/26/take-advantage-of-your-multi-core-processor/">Take advantage of your multi-core processor</a></p>
]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>There and Back Again: The Evolution of the Graphical User Interface</title>
		<link>https://blog.xojo.com/2014/07/29/there-and-back-again-the-evolution-of-the-graphical-user-interface/</link>
		
		<dc:creator><![CDATA[Geoff Perlman]]></dc:creator>
		<pubDate>Tue, 29 Jul 2014 00:00:00 +0000</pubDate>
				<category><![CDATA[Mac]]></category>
		<category><![CDATA[Technology]]></category>
		<category><![CDATA[Windows]]></category>
		<category><![CDATA[Apple]]></category>
		<category><![CDATA[GUI]]></category>
		<category><![CDATA[Mobile]]></category>
		<category><![CDATA[UI]]></category>
		<guid isPermaLink="false">http://blogtemp.xojo.com/2014/07/29/there-and-back-again-the-evolution-of-the-graphical-user-interface/</guid>

					<description><![CDATA[Over the past 30 years, the GUI has evolved and in some ways has come full circle. If you are less than 25 years old, there's a very good chance you've never used a computer that didn't have a graphical user interface. But when the GUI first appeared, it was a radical departure from the way in which most people interacted with a computer.]]></description>
										<content:encoded><![CDATA[<p>30 years ago this past January, Apple launched the Macintosh and with it, the first widely available computer with a Graphical User Interface or GUI. If you are less than 25 years old, there&#8217;s a very good chance you&#8217;ve never used a computer that didn&#8217;t have a graphical user interface. But at the time, it was a radical departure from the way in which most people interacted with a computer. Over the past 30 years, the GUI has evolved and in some ways has come full circle.</p>
<p><span id="more-141"></span></p>
<h4>In the Beginning</h4>
<p>In 1984 when the original Macintosh was introduced, as advanced as its interface was at the time, its graphics capabilities were extremely weak, at least by today&#8217;s standards. It could only manage a black-and-white, 1-bit interface.</p>
<p><a href="http://www.xojo.com/blog/en/assets_c/2014/07/Original%20Mac%20User%20Interface-444.php"><img decoding="async" class="mt-image-center" style="text-align: center; display: block; margin: 0px auto 20px; width: 400px;" title="1984_mac.png" src="https://blog.xojo.com/wp-content/uploads/2014/07/1984_mac.png?t=1466486449161&width=400" sizes="(max-width: 400px) 100vw, 400px" alt="1984_mac.png" width="400" data-constrained="true" /></a>Almost 2 years later, Microsoft Windows 1.0 was released and it added color but was laboring under many of the same technological restrictions.</p>
<p><a href="http://www.xojo.com/blog/en/assets_c/2014/07/Windows1.0-447.php"><img decoding="async" class="mt-image-center" style="text-align: center; display: block; margin: 0px auto 20px; width: 400px;" title="windows_1.0.png" src="https://blog.xojo.com/wp-content/uploads/2014/07/windows_1.0.png?t=1466486449161&width=400" sizes="(max-width: 400px) 100vw, 400px" alt="windows_1.0.png" width="400" data-constrained="true" /></a>The whole idea behind the GUI was to make the interface use elements that were similar to things the user already understood. For example, pushbuttons already existed in the real world. The idea of a window as a place to view things was an analogy, but it connected the user with something they already understood.</p>
<p>Over time, the graphics capabilities of personal computers became stronger. Processors, memory and displays all became better, faster and less expensive. Consider Windows 95 (released in 1995) and Mac OS 8 (released in 1997):</p>
<p><a href="http://www.xojo.com/blog/en/assets_c/2014/07/Windows%2095-450.php"><img decoding="async" class="mt-image-center" style="text-align: center; display: block; margin: 0px auto 20px; width: 320px;" title="windows_95.png" src="https://blog.xojo.com/wp-content/uploads/2014/07/windows_95.png?t=1466486449161&width=320" sizes="(max-width: 320px) 100vw, 320px" alt="windows_95.png" width="320" data-constrained="true" /></a><a href="http://www.xojo.com/blog/en/assets_c/2014/07/Mac_OS_8-453.php"><img decoding="async" class="mt-image-center" style="text-align: center; display: block; margin: 0px auto 20px; width: 320px;" title="mac_os_8.png" src="https://blog.xojo.com/wp-content/uploads/2014/07/mac_os_8.png?t=1466486449161&width=320" sizes="(max-width: 320px) 100vw, 320px" alt="mac_os_8.png" width="320" data-constrained="true" /></a>Both took a big leap forward. With better graphics and color, the user interface felt more realistic. I know that may be hard to imagine if you never used these versions of Mac OS or Windows, but remember and consider what users had been using up to this point. Over time, user interfaces became more skeuomorphic, though this was certainly more true for OS X than it was for Windows. Apple made the Address Book and Calendar look somewhat like their real-world counterparts.</p>
<h4>The Mobile User Experience</h4>
<p>When the iPhone came along, it took the mobile interface to an entirely new level. One of the things Apple realized is that when using an interface on a much smaller screen, you have to simplify. The busier the user interface, the harder it is to use. This meant fewer lines and simpler graphics. It also meant being more choosy about the functions apps would perform because of the limited space available in such a small screen (among other things). The simpler user interface allowed the user to focus.</p>
<p>Android has gone through the same transition over the years, from Cupcake to KitKat.</p>
<h4>The Changing Desktop</h4>
<p>In some ways, this didn&#8217;t start with iPhone and Android. For example, for years many graphics and photo-editing applications have used a dark theme so that the application&#8217;s interface fades into the background helping the user focus on their content.</p>
<p><a href="http://www.xojo.com/blog/en/assets_c/2014/07/Photoshop-462.php"><img decoding="async" class="mt-image-center" style="text-align: center; display: block; margin: 0px auto 20px; width: 400px;" title="photoshop_screenshot.png" src="https://blog.xojo.com/wp-content/uploads/2014/07/photoshop_screenshot.png?t=1466486449161&width=400" sizes="(max-width: 400px) 100vw, 400px" alt="photoshop_screenshot.png" width="400" data-constrained="true" /></a>Microsoft Windows 8 offered a radically different user interface. While the change has certainly been controversial, it&#8217;s clearly meant to simplify the desktop user experience. Microsoft&#8217;s motivation was to provide a single user experience for desktop and mobile. It&#8217;s clear now that this was not a complete success, and it seems they are going to make a new attempt with Windows 9.</p>
<p>Now Apple, with the upcoming release of OS X 10.10 Yosemite, is simplifying the user interface as well. You can click to enlarge the picture below.</p>
<p><a href="http://www.xojo.com/blog/en/assets_c/2014/07/OS%20X%20Yosemite-477.php"><img fetchpriority="high" decoding="async" class="mt-image-center" style="text-align: center; display: block; margin: 0 auto 20px;" src="https://blog.xojo.com/wp-content/uploads/2014/07/os%20x%20yosemite-thumb-400x234-477.jpg?t=1466486449161&width=504&height=295" sizes="(max-width: 504px) 100vw, 504px" alt="OS X Yosemite.jpg" width="504" height="295" /></a></p>
<p>But in Apple&#8217;s case, I believe the motivation is more about simplifying for two reasons:</p>
<p>1) The simpler the visual aspects are, the more the user can focus on their content. Apple is not removing functionality. The change is more about how they render the user interface itself. There are fewer lines and fewer colors.</p>
<p>2) While Apple firmly (and rightly, IMHO) believes that you can&#8217;t have the exact same user experience on a mobile device as on a desktop, there are visual cues that can be shared, such as icons for sharing, which make it easier for the user to switch between devices. Switching between desktop and mobile devices is very common behavior and is only becoming more so.</p>
<h4>Coming Full Circle</h4>
<p>In some ways we have come full circle. We started with simple user interfaces that were skeuomorphic and became even more so over time. At the time, they had to be because the user needed to relate the user interface to the real world. The Graphical User Interface has been around long enough now that it no longer needs to be skeuomorphic. Today, user interfaces need to make sense to users who already have experience with a graphical user interface. The simplified user interface makes the user experience more friendly and allows the user to stay more focused on what they are trying to accomplish. In subtle ways it increases productivity which, after all, is why we use computing devices in the first place.</p>
]]></content:encoded>
					
		
		
			</item>
	</channel>
</rss>
