<?xml version="1.0" encoding="utf-8"?>
<!-- generator="wordpress/1.2" -->
<rss version="2.0" 
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:wfw="http://wellformedweb.org/CommentAPI/"
>

<channel>
	<title>The Last Craft?</title>
	<link>http://www.lastcraft.com/blog</link>
	<description>Programs are written by people</description>
	<copyright>Copyright 2005</copyright>
	<pubDate>Wed, 23 Nov 2005 23:00:17 +0000</pubDate>
	<generator>http://wordpress.org/?v=1.2</generator>

		<item>
		<title>Sarbanes-Oxley versus Agile</title>
		<link>http://www.lastcraft.com/blog/index.php?p=20</link>
		<comments>http://www.lastcraft.com/blog/index.php?p=20#comments</comments>
		<pubDate>Mon, 21 Nov 2005 23:37:48 +0000</pubDate>
		
		<category>Programming</category>
		<guid>http://www.lastcraft.com/blog/index.php?p=20</guid>
		<description>What's Enron got to do with software development? If you work for a big company, quite a lot... </description>
		<content:encoded><![CDATA[	<p>Agile development assumes that the participants are professionals. We take it for granted that everyone will do the best for the team, that everyone is skilled and capable, and that the team is focused on the project being successful for the sponsor. It&#8217;s the assumption of doing the right thing that allows us to skimp on the paperwork and excessive formality. Trust is presumed.</p>
	<p>Then came Enron.</p>
	<p>OK, so there is no trust to be had when it comes to large sums of money, but we developers don&#8217;t make financial decisions. What&#8217;s Enron got to do with us?</p>
	<p>Well, quite a lot as it turns out. You see, the defence of the main culprits was ignorance. They &#8220;innocently&#8221; lost control of the finances and didn&#8217;t know what was going on. Negligent leadership yes, criminal act no, they pleaded. It&#8217;s to remove that defence that the US government introduced the <a href="http://www.aicpa.org/info/sarbanes_oxley_summary.htm">Sarbanes-Oxley legislation</a>. A financial officer accidentally losing control of the finances is no longer a satisfactory excuse for staying out of jail. Not only that, but auditors have the duty and right to report any loss of financial control to the shareholders, and for a public company to the public, to give early warning. That&#8217;s not a small detail. An international bank, say, that received a bad audit on this score would not welcome the publicity. After Enron, it gives the market the jitters.</p>
	<p>So what? This is a programming blog, so stick to the point Marcus. OK, I will. The point is that the loss of trust extends all the way down to our daily working practices. Some examples, the obvious one first.</p>
	<p>Development time costs money. That money should be tied back to a specific project or task. Free floating development costs are a black hole, and black holes are no longer acceptable. You can employ some kind of standard accounting tool to keep track of this, or you can invent your own. Trouble is, if you invent your own you are subject to an audit every year. Accountants probably won&#8217;t be very impressed with a homebrew system that works on index cards, and so using some kind of standard will save a lot of trouble. The change management tool vendors love this of course. Tying version control check-ins to requirements tools is the kind of stuff that cash cows are made of. Tool vendors are spreading the word.</p>
	<p>Now if you are very lucky, you have an enlightened project manager who understands the value of refactoring. You see a problem unrelated to your current work, you fix it right then. Everyone wins&#8230;er&#8230;except you. You have to shoehorn this piece of opportunism through the change management tool. From the demonstrations I have seen so far, these tools don&#8217;t look too agile.</p>
	<p>Suppose you are writing some simple database code that generates a report. Not much need to refactor here, this has been done hundreds of times. Well, the data in this report probably gets used in the performance indicators of the company, the profit and loss figures upon which the big decisions are made. If there is suspicion of that data, the company is losing control of its finances. It could be deceiving the shareholders too. This is not about testing, it&#8217;s about having the authority to work on that report. The software as well as the data must be secure and no one person may hold all the keys. Your handling of passwords becomes subject to external policy. Opening the version control system to <a href="http://www.extremeprogramming.org/map/code.html">shared code ownership</a> may not be part of that policy. That&#8217;s a big loss for an agile team.</p>
	<p>None of these problems are insurmountable of course. A dash of politics, some technical adjustments and a dose of guerrilla refactoring may get you through the day, but it&#8217;s still friction. Fear spreads.
</p>
]]></content:encoded>
		<wfw:commentRSS>http://www.lastcraft.com/blog/wp-commentsrss2.php?p=20</wfw:commentRSS>
	</item>
		<item>
		<title>Listen kids, AJAX is not cool</title>
		<link>http://www.lastcraft.com/blog/index.php?p=19</link>
		<comments>http://www.lastcraft.com/blog/index.php?p=19#comments</comments>
		<pubDate>Fri, 03 Jun 2005 04:11:45 +0000</pubDate>
		
		<category>Programming</category>
		<guid>http://www.lastcraft.com/blog/index.php?p=19</guid>
		<description>I cannot think of a worse collision of technologies than low level user interfaces and AJAX. Avoiding the wreckage will involve new learning and a lot of thinking. </description>
		<content:encoded><![CDATA[	<p>If you are writing a user interface, <a href="http://www.asktog.com/basics/03Performance.html">make sure it responds in 1/10th of a second</a>. That&#8217;s a pretty simple rule, and if you break it, you will distract the user. This rule has pretty much become law, never mind lore. You find it in books such as &#8220;The Humane Interface&#8221; by <a href="http://jef.raskincenter.org/home/">Jef Raskin</a> and many other user interface guides. If you write GUI software, you are well aware of it.</p>
	<p>I cannot find this rule anywhere in Jakob Nielsen&#8217;s &#8220;Designing Web Usability&#8221;. It&#8217;s not a bad book, in fact it&#8217;s an excellent one. It&#8217;s just that web interfaces are a different usability problem, one of organising fairly static information. You don&#8217;t need stopwatches or video cameras to study users in this environment. Just five test subjects and perhaps the Apache logs. This is stuff anyone can do. You only have to state that all links have to be <a href="http://www.asktog.com/columns/047HowToWriteAReport.html">underlined blue to start out</a> in this community. Until now, the low level neurology has been left to the writers of web browsers. The two communities have been separated.</p>
	<p>Except now we have <a href="http://www.adaptivepath.com/publications/essays/archives/000385.php">AJAX</a>.</p>
	<p>AJAX is JavaScript based and JavaScript is usually used to add convenience and to pretty up web pages. Because sites had to work without JavaScript, usage was limited to extending the HTML or saving on the odd page request. Then came <a href="http://gmail.google.com/">GMail</a>. Suddenly we have lots of web developers &#8220;enhancing&#8221; the browser experience with behind the scenes XML fetching back to the original site. I cannot think of a worse collision of technologies than low level user interfaces with requests over the internet. The delays and failures of internet traffic are especially painful in this environment and, from the AJAX demos I&#8217;ve seen, the developers aren&#8217;t helping.</p>
	<p>A typical demo is form validation. The first field is usually one where the user can select a new user name. Of course that username could be taken, and so this initiates an AJAX request to the server. Meanwhile I have carried on typing and am a few fields in when a dialog pops up. You don&#8217;t need a <a href="http://www.eecs.umich.edu/~kieras/goms.html">GOMS</a> analysis to know that this is going to be extremely annoying. I dismiss the dialog, it said something about a database I think, and retype the second half of my word. I go to submit the form and find the submit button is greyed out. I eventually work out that the username has been cleared and I retype it and quickly click submit. Oh joy.</p>
	<p>OK, this is a badly designed example and it could be improved in several ways. For example, within 1/10th of a second, I could highlight the field in some way. Maybe I could grey out the text or highlight the border of the field with a pale yellow. When the response comes back I could highlight the border as red on failure and <strong>only then</strong> disable the submit button. The highlighted field had better not have scrolled off the screen and I had better have a helpful message next to it by then. If the user can type faster than my server can respond, likely if the user is habituated to a form, then they should be allowed to submit. Otherwise habituation is lost and the interface starts invading their short term memory.</p>
	<p>Even when the process is improved, there is no guarantee that we have enhanced the user experience. Entering a form is a familiar operation. We can do it whilst answering the phone or explaining something to the kids. The extra time fetching a separate error page may be time that I am putting to other uses anyway. Perhaps I just needed a rest. An interface that jars me out of my familiar path is probably not helping me at all. I don&#8217;t know this of course, because I haven&#8217;t measured it. But if you design such an interface, you don&#8217;t know either. You need to measure it, and this kind of usability is a lot more work than watching people click on pages.</p>
	<p>I read more web pages than I do books and I spend more time doing it than I watch TV. I don&#8217;t think I am alone in being habituated to the way the web behaves as pages. When you write AJAX applications you drive a horse and cart through one of the most successful metaphors of all time. GMail can get away with it, because it&#8217;s very close to a related metaphor, the mail application. Being the new way has a price.</p>
	<p>AJAX does have some uses. If you are exploring a dataset, you don&#8217;t want to fetch the core data again on every attempt to expand just a portion of the information. I have seen an excellent demo by a colleague with a trading system. Using a rollover it is possible to see trading history for each market indicator. Because this is an intranet system, it is responsive enough, and because the information is embedded in other explanatory content, it makes sense to use a web interface. That demo was cool, but it was a pet project not a deployed application. These put AJAX on collision course with another issue in software development, automated tests.</p>
	<p>We have it easy as web developers. We just shovel text around and text is easy to test. Unsurprisingly there are a lot of <a href="http://www.softwareqatest.com/qatweb1.html#FUNC">tools to test web content</a>. Testing GUIs is far harder, so hard that it isn&#8217;t usually attempted. Instead a thin presentation layer is written and the calls to it are intercepted. The presentation layer still has to be tested by hand and that will delay a rollout. GUI applications are usually shrinkwrapped items, so that&#8217;s no problem for them. A web server may get rolled out twice a day. That&#8217;s a big problem for us. </p>
	<p>More advanced tools may help a little here. <a href="http://selenium.thoughtworks.com/index.html">Selenium</a> and <a href="http://www.clabs.org/wtr/index.cgi?page=/WaTiR">WaTiR</a> at last make JavaScript testing possible, but it&#8217;s not easy to set up for integrated testing and you still need a browser. I haven&#8217;t yet seen an AJAX demo tested with Selenium. If anyone tries it, I&#8217;d like to know.</p>
	<p>AJAX has possibilities, but it&#8217;s not there yet. Not as a community and not with the tools. Web developers cannot become GUI developers overnight. We need time.
</p>
]]></content:encoded>
		<wfw:commentRSS>http://www.lastcraft.com/blog/wp-commentsrss2.php?p=19</wfw:commentRSS>
	</item>
		<item>
		<title>How did Google get it wrong?</title>
		<link>http://www.lastcraft.com/blog/index.php?p=18</link>
		<comments>http://www.lastcraft.com/blog/index.php?p=18#comments</comments>
		<pubDate>Wed, 09 Feb 2005 02:15:20 +0000</pubDate>
		
		<category>Programming</category>
		<guid>http://www.lastcraft.com/blog/index.php?p=18</guid>
		<description>If you run a blog or Wiki you will be only too aware of the Google PageRank(tm) system. So Google have decided to help the bloggers? Er...no. </description>
		<content:encoded><![CDATA[	<p>If you run a blog or Wiki you will be only too aware of the Google PageRank&#8482; system. In case you have been on a rather extended holiday and/or in a long coma, it&#8217;s a system whereby your site climbs the search engine results page if lots of other people are linking to you. It&#8217;s not quite that simple, but that&#8217;s the gist. In competition with each other to promote sales of Viagra, or to get people hooked on gambling, various crooked characters deface public sites with gay abandon. They leave a trail of links pointing at their own sites, often with Chinese titles, all to boost their own PageRank. All to climb Google.</p>
	<p>These comment spammers are not nice people.</p>
	<p>They will happily destroy the content of a Wiki and overwrite every page. If they don&#8217;t get every one, then it&#8217;s usually because their script is too stupid to keep track of the pages it&#8217;s already written over and so cannot get to the now newly orphaned pages. These scripts hammer the site while they operate. Not only that, but the frequency of attacks is now at epidemic proportions. I get about three separate attacks a day on my blog and about five major attacks a day on <a href="http://www.phplondon.org/wiki/">this PageRank 7 wiki</a>. Faced with the brutality and increasing frequency of these incursions, ISPs can take down servers believing they are under a denial of service attack. Even if they understand the phenomenon, such attacks cause too much server load for the value of having the small blog customer. ISPs are starting to ban the use of tools like WordPress and Movable Type on their end user accounts.</p>
	<p>OK, it&#8217;s not just Google to blame here, but all of the search engines. It&#8217;s just that Google&#8217;s system is the most well known and this has historically made it the main spam target. In a tacit acknowledgement of this, <a href="http://www.google.com/googleblog/2005/01/preventing-comment-spam.html">Google have decided to help the bloggers</a>. Er&#8230;sort of.</p>
	<p>Their solution is to allow you to take away the PageRank value of selected links. If you are maintaining Wiki/blog software then comment field links should have a &#8220;rel&#8221; attribute (uh?) set to &#8220;nofollow&#8221;. That way the spammers will lose the incentive to spam you, because they will get no benefit from the links they leave. &#8220;Drat&#8221; they say, as they abandon their get rich quick scheme and go off to earn an honest wage.</p>
	<p>The plan is so idiotic it&#8217;s almost surreal. It obviously in no way penalises the spammers, who are playing a percentage game anyway. So what if a few spams are ploughed into stony ground? It does make the engine spider&#8217;s life a little easier of course, because it can spend less time indexing blogs. Lucky old engines, poor old webmasters who are expected to upgrade all of their software. Software that has been heavily customised and, given that few of these applications are design masterpieces, heavily hacked. I certainly won&#8217;t be upgrading when there is zero benefit. Even if I do, the new attribute has to survive RSS feeds and some old and not so smart news aggregators. Really I won&#8217;t have time anyway because I am too busy fighting spam.</p>
	<p>What&#8217;s even more surreal though, is that the software authors are jumping on board and working on adding this as a feature. There is even talk of making it part of the HTML standard. This attribute is about as useful as the blink tag.</p>
	<p>Suppose the engines had tackled it differently. Suppose that when your site was spammed, you could dispatch the content of the spam straight to Google, Yahoo, etc. They could then ban all of the links promoted with the dubious posting. A sort of &#8220;SpamBack&#8221;. This changes the market forces significantly from the peddlers&#8217; point of view. Far from ploughing on less fertile ground, they are now ploughing a minefield. Rather than one hundred percent of everybody having to manually instruct the GoogleBot, all it would take would be a small percentage of spam aware applications to fight back. The spammers could not risk dumb spamming for fear of tripping these alarms.</p>
	<p>I bet there are other simple solutions as well.</p>
	<p>So how did Google get it wrong? There are smart people in Google, so did they not allocate enough time to this? Perhaps they lost touch? Can you see blogger peons working in a lowly office from the hallowed windows of a &#8220;plex&#8221;? Perhaps the Google blog could explain as it&#8217;s hardly a public relations coup. <a href="http://www.nonofollow.net/">Whole sites have sprung up against &#8220;nofollow&#8221;</a>.
</p>
]]></content:encoded>
		<wfw:commentRSS>http://www.lastcraft.com/blog/wp-commentsrss2.php?p=18</wfw:commentRSS>
	</item>
		<item>
		<title>CEOs are chickens</title>
		<link>http://www.lastcraft.com/blog/index.php?p=17</link>
		<comments>http://www.lastcraft.com/blog/index.php?p=17#comments</comments>
		<pubDate>Sat, 27 Nov 2004 05:46:59 +0000</pubDate>
		
		<category>Programming</category>
		<guid>http://www.lastcraft.com/blog/index.php?p=17</guid>
		<description>The Scrum meeting rule says that pigs, committed project members, are allowed to talk. Chickens, people whose career would be unscarred by project failure, can only listen. In a small web firm, besides the web developers, who are the pigs? </description>
		<content:encoded><![CDATA[	<p>The following analogy comes from <a href="http://www.controlchaos.com/">Scrum</a>. In fact I am going to quote Ken Schwaber and Mike Beedle&#8217;s &#8220;Agile Software Development with Scrum&#8221;&#8230;</p>
	<blockquote><p>
A chicken and a pig are together when the chicken says, &#8220;Let&#8217;s start a restaurant!&#8221; The pig thinks it over and says &#8220;What would we call this restaurant?&#8221; The chicken says &#8220;Ham n&#8217; Eggs!&#8221; The pig says, &#8220;No thanks. I&#8217;d be committed, but you&#8217;d only be involved!&#8221;
</p></blockquote>
	<p>The Scrum meeting rule says that pigs, committed project members, are allowed to talk. Chickens, people whose career would be unscarred by project failure, can only listen. Opinions from pigs will have the needs of the project at the top of their concerns. They cannot afford to put self interest first, leading to balanced and rational compromise. So in a small web based firm, besides the web developers, who are the pigs?</p>
	<p>The sales managers are usually piggies. They have sales targets, so usability, customer profiling and conversion rates are vital to them. Missing those targets is financially limiting, and possibly career limiting too.</p>
	<p>Marketing are also in the pig pen. They will need a constant stream of information from the developers, usually in the form of processed log files. They also need to post process content for search engine optimisation and usually have link building programs in play. If the developers cannot supply these services then the marketing plan can be severely disrupted. Even if marketing&#8217;s jobs are safe, someone&#8217;s head will eventually roll.</p>
	<p>Another porker is the content manager. An unpublished author has achieved nothing and will complain loudly. There may have been expensive copyright negotiations beforehand that won&#8217;t repay themselves until publication. Also content ages. A delayed appearance on the web site could invalidate it. If the content manager has a problem, the development team will hear about it in about the time it takes to walk down the corridor.</p>
	<p>The support staff slopping around in the mud are utterly dependent on IT. They can have a miserable job, so let&#8217;s not make it worse for them.</p>
	<p>By contrast the CEO can take project scale action. That action could be as drastic as firing everyone responsible and outsourcing the whole project to India, and they might do it anyway if it&#8217;s perceived to be in the interest of the company. More usually intervention comes in the form of long term strategy changes that affect the other stakeholders. The mission statement of the project will shift accordingly, or the project may fragment or be allowed fewer resources. The original plans can be changed to the point of mutilation by them. The CEO is committed to the company, not the project.</p>
	<p>If you are using iterative development then you have supplied your CEO with <a href="http://abc.truemesh.com/">options</a> other than cancellation or expensive change requests. The CEO already has sufficient influence over the warring parties that there is no need for them to write stories or micromanage priorities.</p>
	<p>So, can you keep the CEO from interfering in iteration meetings? Good cluck&#8230;
</p>
]]></content:encoded>
		<wfw:commentRSS>http://www.lastcraft.com/blog/wp-commentsrss2.php?p=17</wfw:commentRSS>
	</item>
		<item>
		<title>No URLs in my blog</title>
		<link>http://www.lastcraft.com/blog/index.php?p=15</link>
		<comments>http://www.lastcraft.com/blog/index.php?p=15#comments</comments>
		<pubDate>Wed, 27 Oct 2004 03:02:35 +0000</pubDate>
		
		<category>Programming</category>
		<guid>http://www.lastcraft.com/blog/index.php?p=15</guid>
		<description>Loose coupling in my blog. Are URLs dead? </description>
		<content:encoded><![CDATA[	<p>It&#8217;s an experiment. I want to see if URLs are dying.</p>
	<p>The idea is that the search engines are now so good, there is no need for a hard link to another site I have no control over. I&#8217;m a programmer and so I want loose coupling from my blog. Link rot is a good example of a dangling pointer to me and so I am refactoring back to English descriptions. Back to a query rather than a reference if you like. The plan is that, given sufficient keyphrases, you will almost certainly find the same reading material I based the blog entry on. Not exactly the same maybe, but blogging is about news and ideas rather than specific documents so I have room for flexibility.</p>
	<p>There are going to be problems with this. The first is that the blog entries will be less eye catching. Web users start with a title and then start looking for blue underlined text. Well I don&#8217;t have any, just a wall of black and white, so the posts look rather boring. Hopefully I can make up for it with controversial titles that annoy people into reading on.</p>
	<p>A more serious problem is that sight-impaired users are more dependent on significant text than sighted users. It&#8217;s common for screen reader users to use the tab key to cycle through titles and links. With fewer tab hits on the content, but still the same number on the navigation, I am reducing the signal to noise ratio for these users when they browse.</p>
	<p>Another slim possibility is that I may know a subject better than my reader and supply insufficient context to the subject without realising. Well if that happens then probably I have lost the plot on the whole posting anyway. I don&#8217;t even want to think about that scenario.</p>
	<p>One positive benefit is that I plan to refer to books by full title and author. That means you can choose which online bookshop you prefer to use. If you are in the Amazon camp and feel your obligatory quick fix is missing, then you can still select the text and use the A9 metacrawler and arrive almost as quickly. Little is lost I think and I am not usually sympathetic to majorities. You have it too easy.</p>
	<p>Of course all of this saves me time as well, so it&#8217;s also an experiment in lean blogging. Tracking down URLs and making sure they are the permanent ones, not temporary front pages, is as much effort as an extra paragraph. For something that is only a half dozen paragraphs, that&#8217;s a lot of overhead.</p>
	<p>It would be nice if links could be created in HTML that just contained keywords or just bounded text. You could set your favourite engine into your browser, or be stuck with MSN if you were using IE, and have the browser submit the query. The browser could even pop up a menu of results as you hovered the mouse over. Handy as I hate having to leave the page I&#8217;m on when following an article, but want to look up a reference. All that would be necessary then would be for web content writers to mark text as significant. <b>Perhaps the bold tag would do as a temporary measure.</b></p>
	<p>Anyway, adding anchors seems too inflexible these days. I&#8217;m also rather lazy. Actually that&#8217;s the real reason.
</p>
]]></content:encoded>
		<wfw:commentRSS>http://www.lastcraft.com/blog/wp-commentsrss2.php?p=15</wfw:commentRSS>
	</item>
		<item>
		<title>assert All Swans Are White()</title>
		<link>http://www.lastcraft.com/blog/index.php?p=12</link>
		<comments>http://www.lastcraft.com/blog/index.php?p=12#comments</comments>
		<pubDate>Mon, 11 Oct 2004 06:06:15 +0000</pubDate>
		
		<category>Programming</category>
		<guid>http://www.lastcraft.com/blog/index.php?p=12</guid>
		<description>You never know absolutely for sure that your project is working correctly and you never will, but you can exude confidence. </description>
		<content:encoded><![CDATA[	<p>You cannot prove anything by testing. No matter how hard you try, there could always be something wrong, or some combination of conditions or some external event that you haven&#8217;t thought of. You never know absolutely for sure that your project is working correctly and you never will.</p>
	<p>The nature of proof is an old issue and one that was faced by the scientific community in its debate with religion. We are scientific, the scientists would say, because we do experiments to prove things. Aha, the religious leaders would say, you ain&#8217;t proving anything as long as there is another experiment you can do. Another experiment is another unknown and if something is unknown it isn&#8217;t proven. Therefore you are a religion too because your conviction is based on faith in your unproven theories, Q.E.D. For the scientists this must have smarted a bit, but fortunately Karl Popper came to the rescue in 1934 with his book &#8220;The Logic of Scientific Discovery&#8221;. Here is his illustration that separates the two camps&#8230;</p>
	<p>You live in a village and all you have seen today are white swans, from which you conclude all swans are white. A little rash perhaps, but logical. Now this isn&#8217;t yet a convincing theory, so we wander down to the village pond and look for swans. We are not looking for white swans though, we are looking for non-white ones. This asymmetry is important, because although more white swans advance our case only a little, a single black swan will kill it stone dead. Because the threat to our theory is so devastating, every time it survives it increases our confidence. If we want to persuade the world that our theory is correct we want to attack it as often as possible. Ideally we will search high and low for black swans, but fail to find any. We want to test the theory in as many novel ways as possible, and with notoriety others will test it too and they will think differently from us. And so collectively we never prove it, but we do get ever more confident.</p>
	<p>For this to work our theory must be disprovable. The counter theory that there exists in the universe at least one black swan is not provable, and so not scientific. Note that scientists don&#8217;t have to be scientific, only the theories. Actually it helps if they are as mad as hatters, because that way we get a greater variety of testing. This process also allows us to have a single scientific truth. If two theories differ by prediction then we can determine which is correct by experiment. If they don&#8217;t then we hack away with Occam&#8217;s razor until the theories are identical anyway.</p>
	<p>Back to the code. We had a theory that it isn&#8217;t broken.</p>
	<p>Life is a little simpler for developers because we are not usually dealing with an infinite black box. It&#8217;s as if we could see the cogs of a small part of mother nature laid out before us and we are just checking the workmanship. This gives us an alternative approach. If the code is simple enough then we can completely understand it and won&#8217;t have any bugs. The whole project will probably never be in that state, but small sections of code will. The extra clarity we get doesn&#8217;t just give us our first theories of how the code behaves, but also allows us to eliminate vast swathes of possible tests as too trivial to bother with.</p>
	<p>These areas of understanding are also fed by the tests themselves. I think we hop between theory and understanding in short cycles, and have a mix at any one time. Our areas of complete understanding are temporarily demolished during refactoring and are reduced to theories that our code still works as before. During and just after these transitions we add tests to resolve conflicting models in our heads and, with further change, turn unclear parts into areas of understanding again. I think this gets to the heart of testing with refactoring. Because the tests act as a cushion that allows us to drop back to theorising, they allow us to leap into the unknown. They are an agent of change.</p>
	<p>On the project scale we haven&#8217;t a hope, but more people help. Inspection and pairing help produce more understandable and more fully understood code. For those parts that are just assumed working, more varied people with an antagonistic attitude will shore up those assumptions with novel and challenging tests. As many as resources allow, so perhaps the many eyeballs theory has some merit.</p>
	<p>We still haven&#8217;t proved it&#8217;s working of course, but we can exude as much confidence as we want.
</p>
]]></content:encoded>
		<wfw:commentRSS>http://www.lastcraft.com/blog/wp-commentsrss2.php?p=12</wfw:commentRSS>
	</item>
		<item>
		<title>Install me</title>
		<link>http://www.lastcraft.com/blog/index.php?p=11</link>
		<comments>http://www.lastcraft.com/blog/index.php?p=11#comments</comments>
		<pubDate>Sat, 02 Oct 2004 21:50:40 +0000</pubDate>
		
		<category>Programming</category>
		<guid>http://www.lastcraft.com/blog/index.php?p=11</guid>
		<description>We all constantly perform cost-benefit analysis on everything we do, and if your project drops below our threshold for even a second we will ditch it and go on to something else. Instant gratification is best. </description>
		<content:encoded><![CDATA[	<p>I am not the slightest bit interested in your program.</p>
	<p>I am surrounded by problems and have a to-do list as long as my arm. The only reason I am at your web site right now is because I have heard an unlikely rumour that one of my problems will be completely eliminated by your software. It is going to positively leap out of the computer and start resolving issues while I put my feet up and start to enjoy life. At least that&#8217;s what I&#8217;ve heard. You&#8217;ll forgive me if I&#8217;m sceptical.</p>
	<p>First impressions mean a lot. We hate to believe this, but it&#8217;s true. When I used to teach I would find that the tone of the lesson was set within the first five minutes. The tone of the first five minutes would be set by how the children entered the classroom and the tone of that would be set by how I greeted them in the corridor. It&#8217;s difficult to turn things around after a bad start.</p>
	<p>My first contact with your software is likely the web site with the download link. If the eyeball tracking studies are correct, I will read the title first and then start scanning for blue underlined text. I am already looking for the link marked &#8220;download now&#8221;. As an aside, if I arrived at this page with a Linux browser from a UK IP, chances are I would like the Linux version from a European mirror, so please don&#8217;t ask. Assuming the file dialog opens straight away, I can consign the thing to my home download folder and carry on reading your project landing page. This is where the fun begins.</p>
	<p>You have to hold my hand until the benefits of your project are obvious enough to warrant self-study and experimentation, and I&#8217;m an unenthusiastic slow learner. We all constantly perform cost-benefit analysis on everything we do, and if your project drops below my threshold for even a second I will ditch it and go on to something else. Instant gratification is best.</p>
	<p>The first and most difficult hurdle is clicking &#8220;install&#8221;. Don&#8217;t think that&#8217;s much of a problem? Go to your personal download folder now and have a look around. Full of tar and zip files right? What percentage of those have you unpacked? Of those, how many have you installed? If you are anything like me, likely a third at most are doing more than acting as hard drive filler, and yet all I had to do was two clicks. If your landing page has a long list of install instructions I will even click the browser cancel button right now. The thought of any extra work is just too frightening.</p>
	<p>I may want doorstep convenience, but I don&#8217;t want you entering my house uninvited. Before you perform any install operation I would like to know exactly where you are putting stuff. It&#8217;s my computer and I like to keep it tidy when I can. I also want to be able to remove your program the instant I am disenchanted with it, and if I don&#8217;t think that&#8217;s possible I won&#8217;t install in the first place. My machine is stable right now and I want to keep it that way.</p>
	<p>If your program is GUI based then I&#8217;ll run it now. I want to do something straight away and I want to see a result. Wizards don&#8217;t help, because they do stuff that I don&#8217;t understand anyway. Chances are I want to read a file, or write a very simple one. I don&#8217;t want to create projects, import directories or fill in loads of personal preferences. Once I know that your software is working I will start on the tutorial.</p>
	<p>If your software is a programmer library then things are actually easier. I am going to carry on reading your web page and will read the &#8220;quick start&#8221; guide. I am going to follow the instructions on your page to the letter and I am not going to engage my brain in any way at all. I want to see the equivalent of &#8220;Hello world&#8221; in five lines of code or less with exactly the output described by your website. No big XML configuration files or templates to fill out, just a single script. Remember I have also downloaded your rival&#8217;s framework. You know, the one who always claims that his version is better than yours in the newsgroups? If everything seems to be working I&#8217;ll start on the tutorial.</p>
	<p>There is a tutorial isn&#8217;t there? One that talks to me at a level and in language I can understand?</p>
	<p>And if the tutorial starts to tell me how to solve my problem I&#8217;ll cheer up a bit. Once I am reading about the things I can now do it starts to get interesting, even fun. I&#8217;ll lean back and sip my tea - did I mention I was from the UK? - and I&#8217;ll play with your examples and learn to use your creation to solve my problem. If it does I&#8217;ll definitely send you a thank you e-mail. I&#8217;ll even send you bug reports when it crashes and suggestions for new features. And when you tell me that the feature already exists I&#8217;ll kick myself for not reading your manual and apologise to you profusely. I&#8217;ll tell all my friends how great your software is too, even though I never did try that other one from your rival. And it all happened because you had the care to help me through my first tentative step.</p>
	<p>How could I ever have doubted you?
</p>
]]></content:encoded>
		<wfw:commentRSS>http://www.lastcraft.com/blog/wp-commentsrss2.php?p=11</wfw:commentRSS>
	</item>
		<item>
		<title>The best software book ever</title>
		<link>http://www.lastcraft.com/blog/index.php?p=9</link>
		<comments>http://www.lastcraft.com/blog/index.php?p=9#comments</comments>
		<pubDate>Tue, 28 Sep 2004 12:48:03 +0000</pubDate>
		
	<category>Programming</category>		<guid>http://www.lastcraft.com/blog/index.php?p=9</guid>
		<description>Books should change people. That's difficult with such an abstract and subtle topic as OO. There is one book better at this than any other. </description>
		<content:encoded><![CDATA[	<p>Books should change people. This one does.</p>
	<p>This is not an easy thing to do, but it gets even harder with mental distance between the author and reader. When you first learn something you are in an ideal position to explain it to someone on your own level. I think that&#8217;s why learning in small groups is so effective. The group is less likely to get stuck as explanations cross-pollinate within the group. Once you are a few rungs further up the ladder, though, too much seems obvious. Your explanation will skip vital steps, or simply not give enough time for your pupil to take things in. Once you learn a topic a little more thoroughly, you do start to teach it better and bridge the divide. That&#8217;s not enough in itself, though. It takes a higher level of care to get sufficiently under the skin of a subject not just to understand it, but to know where the cognitive difficulties lie.</p>
	<p>John Holt, in his classic book &#8220;How Children Fail&#8221;, describes a lecture where an educationist professor is teaching maths to children who have had extreme learning difficulties. I mean really extreme, to the point of being &#8220;retarded&#8221;. He conducted this maths lesson with coloured rods, each colour corresponding to a particular length. The test he set the children was so trivial you probably wouldn&#8217;t think it a problem at all.</p>
	<p>He and all the children each had a tray of different coloured rods. He took two length seven rods and sandwiched between them a length four rod. One end was flush with the other two, leaving a length three gap. The problem he set them was to find this length three rod. For them that was no easy task, but the clever part was next. He turned the ensemble upside down and let the length four rod drop out. The next task? Find the rod that fits in the gap. Yes, that was all. I don&#8217;t want to spoil the ending, but I defy you to read it and not have a tear in your eye.</p>
	<p>Fellow programmers don&#8217;t usually have that much of a gulf between them, but they do have more complex hurdles. One that is common to just about every programmer is understanding object oriented programming. Not just knowing encapsulation, polymorphism and inheritance, but actually being able to write their first program with objects.</p>
	<p>You don&#8217;t have to trawl forums and newsgroups very much to realise that this is a popular topic. You see a constant stream of cries for help from people all at sea. They don&#8217;t know how to start. They worry that their code is not &#8220;reusable&#8221;. They don&#8217;t understand that something they could have written a simple function for seems to be taking three times as much code with objects. Likely they end up with one big class, or lots of classes that don&#8217;t seem to do anything. Or lots of classes that really don&#8217;t do anything. And then they add something and it&#8217;s not reusable at all. Some give up and go back to scripting and some of these proclaim that OO is all hype and no one else should bother either.</p>
	<p>With such a difficult and well-known barrier to cross you would have thought there would be plenty of books to help. There are, but many inject as much fear as they do information. Firing off patterns is not enough, you see; we have to explain it a rung or two lower. That&#8217;s difficult with such an abstract and subtle topic. I have seen only one book that does this.</p>
	<p>Explaining a rung or two lower is clever enough, but this book does more. It doesn&#8217;t just take the explanation down to a level that every developer can understand, it turns the subject into a puzzle. Puzzles are fun. You can try them one way and you can try them another and see what happens. That&#8217;s the secret of a good puzzle. You want the player to be able to see ahead, but not so far that they see the whole solution. What was once overwhelming now becomes a wealth of possibilities. That takes away the fear, and fear is bad for learning. Fun is what you need, and also the confidence that the tools you are using are the same tools as the experts.</p>
	<p>You wouldn&#8217;t think at first that the book is to do with fun. A large part of it is a rather tedious catalogue and I doubt anyone has read it all the way through. Luckily, I am not measuring success by pages read, but by impact. I can fling this book at an up-and-coming developer and know that three weeks later our conversation will resume on an altogether higher plane. It works every time and that&#8217;s astonishing.</p>
	<p>That book is &#8220;Refactoring&#8221; by Martin Fowler. Someone give that man an OBE.</p>
]]></content:encoded>
		<wfw:commentRSS>http://www.lastcraft.com/blog/wp-commentsrss2.php?p=9</wfw:commentRSS>
	</item>
	</channel>
</rss>