Carbon based life form

Wednesday, November 6, 2013

How a terrorist is crafted

If you really want to do this properly, imagine you are Casper the ghost. You can go anywhere and see everything. Also, you are dead, so you no longer have to worry about religious preconceptions. Whatever you thought, or didn't think, has already been recorded against your name, so nothing really matters now.
Set aside your prejudices for a moment.

OK, background first. Israel annexed East Jerusalem in 1967. To build there you now need a building permit (regardless of whether you have always lived there). Moving on ... the Jerusalem municipality granted the Arab sector 36,000 building permits, "more than enough to meet the needs of Arab residents through legal construction until 2020".

How many Arabs live in East Jerusalem? 260,800. On the face of it that is not bad: 260,800 people divided by 36,000 permits is roughly 7 per house. However, you have to be able to get the building permit. What do you do if you can't get one (apparently it is very difficult)? An even more interesting question is, what do you do if you can? With 40% unemployment among the Arabs, only a select few would be able to afford to build.

So they do the best with what they have, and join with others and extend on what is already there.

So that has set the background for the rest of the play.

A kid goes to school and comes home to this


A couple of things to pick out of the picture. Lower left, see the guys with guns? Lower middle, see that red mattress? Do you think it belongs to a grown-up? Notice the black tanks on the roofs? They are water tanks; according to the Association for Civil Rights in Israel, 130,000 East Jerusalem Arabs have no running water, yet they still pay taxes.

What is going to happen to that land? Plans to construct 25,000 Jewish homes in East Jerusalem are in the development stages.

 So why am I writing this? I guess because she doesn't have an internet connection.
"A Palestinian child shouts to Israeli police, not pictured, while her family house is being demolished by the municipality in the east Jerusalem neighborhood of Beit Hanina"

Where will the family go now?




What will the kids do next?



That is not the solution, and neither is this



However, I think it is the inevitable result. 

You may think I am being biased against the Jews. Just to set the record straight, here are the statistics on terrorist attacks in Israel.


Why can't they get on? If you are thinking that it is because one side is evil and the other is not, you are no longer Casper, you are back to being human. Oh by the way, that is not always a good thing!

Saturday, June 27, 2009

Noah's ark

I came across this:
http://hanxter.wordpress.com/2008/10/08/working-replica-of-noahs-ark-opened-in-schagen-netherlands/
Some guy built a working replica of the ark according to the Bible's description. Well, I would just like to pose a little problem:
The Australian megafauna died out about 20,000 years ago. The Aborigines arrived about 60,000 years ago, so the two actually co-existed here for a while. Simple logic: if the Aborigines arrived here by boat from Indonesia and the megafauna were already here, then something is wrong with the Ark story.
If the Ark story were true, then after the flood the megafauna would have had to migrate back from the Middle East to Australia. The problem is that the last time the continents were connected was roughly 200 million years ago.

So one of the following must be true,

  • They would have had to swim
  • They came with the aborigines
  • Noah existed 200 million years ago so the megafauna could walk home.
  • God just placed them here after the flood
  • Carbon dating and geological uranium dating are completely wrong.


Let's rule out swimming: some animals can migrate across open sea by rafting on a log, but these weren't your average marsupials; they were huge.
Maybe the Aborigines brought the megafauna with them by boat. However, one of them, Megalania prisca, was a 5.5 m long carnivorous goanna weighing 600 kg (1,322 lbs). I'd like to see that in a fishing canoe lost at sea.
If Noah existed 200 million years ago then there are a lot of other questions that need answering, but none of the questions could really be taken seriously, so let's not go there.
God just placed them here. OK, but how did they get to the Ark in the first place? "And of every living thing of all flesh, two of every sort shalt THOU BRING into the ark." So Noah had to collect them! Well, to give him credit, at least he was 600 years old when it started to rain. How could one man and his three sons collect every type of animal from Australia and take them back to the Middle East? Not only that, but he had to collect the food for them as well (Gen. 6:21). Now, a koala's diet is leaves from eucalyptus trees. So to take a koala and its diet all the way to the Middle East, have it survive for 40 days, and then travel back again is impossible; the leaves would have dried out and the koalas would have died. At no point in the story of the Ark does it imply that the animals could survive without eating, so Noah would have had to take eucalyptus trees with him.
Carbon dating is wrong? OK, it is not perfect beyond 30,000 years due to environmental contamination, but the bones of the megafauna have been found at Aboriginal cooking sites, so the two definitely co-existed.

Perhaps a flood, say 6,000 years ago, wiped out both the Aborigines and the megafauna, and then the continent was repopulated. The problem is that there is no geological record of a flood of that scale in Australia. Floods leave layers of sediment. Pristine bones of megafauna have been found in the caves of the Nullarbor, where the animals would wander in and get lost or fall into sinkholes. The bones would have had at least a sediment covering if there had been a flood. There is evidence that the megafauna may actually have been killed off by drought. That is also why the limestone caves of the Nullarbor exist: the climate there was once a lot wetter than it is now, and when it changed, some of the megafauna were unable to adapt in geological time (the modern kangaroo has its heritage in the megafauna).

I choose to see the Ark story as an allegory and someone who builds one as having too much money and too much time on their hands. And I would like to add that I think it is a crime that they would encourage other people to follow in the same pointless pursuits.

Saturday, September 13, 2008

Could a Python eat a Fox?

Evolving from Foxpro
A while ago I made a big push to convert all the user interfaces to HTML. It was tricky because I host the interfaces in the _webbrowser control when running on the desktop, and the same interfaces run through the web. It works great, and I can't believe more people aren't into this approach. The other benefit of HTML interfaces is caching. I built a web server into the app so that it can run as a service and users can access the app across the network. Performance is great.

For desktop apps, I am hoping that 64-bit platforms won't affect my apps too much; I have heard that I should be able to run the app in 32-bit compatibility mode (which I haven't tried yet). Run as a COM server, I am able to pull the HTML interfaces into .NET, so I can also have a '.NET' frontend when it is a corporate requirement for installation at some sites.

I can write in C# and C++, but in my mind they are not what I would use for writing business-type applications. Strong typing is great in its place, but it is annoying as hell when all I care about is building a business rule. VB.Net might be able to do the job as well, but I am very wary of being locked into a proprietary language again. Enter the Dynamic Language Runtime (DLR) layer for .NET.

Since the DLR acts as a layer on top of the CLR, dynamic languages that leverage the DLR have full access to .NET libraries written in other languages like C#. So I have access to those languages when I want them. But the DLR also supports non-proprietary languages (Ruby, Python, JavaScript).

I am having a good look at Python. I have the skills to use JavaScript, but Python has an enormous number of support libraries already in existence for anything I could want to do with it, and it seems better suited to backend processing. I have been an MS shop through and through, so until recently an open-source language wasn't going to be an option ... until I read about IronPython and .NET. MS employed the guy who was heavily involved in Jython (Python on Java) to write IronPython, which is Python on .NET, and made it open source! Apparently IronPython will run in Silverlight as well. God, another language, right? That's all I need. But the more I look at it, the closer to Fox it seems.

Related, but separate: I am starting to take Google's offerings seriously. Google is doing some revolutionary things that may have very big implications for programmers. What's that got to do with Python? Google employs the guy who wrote Python, Python is core to their web applications, and now MS has its own version of the language. That is job security, I think.

So the pathway forward will be to take the business rules and convert them to run in Python. They will probably be written in IronPython, though apparently there are implementations of Python that can be accessed through the MS script control (and through ASP pages). This code will be triggered from the server-side code in the HTML interfaces. The cursor record is converted into an object and passed into Python for business processing. Hopefully, this code will be generic enough to run in Google web apps as well.
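A minimal sketch of what I have in mind, assuming the cursor record arrives as a plain dictionary (the names `invoice` and `apply_discount`, and the discount rule itself, are purely illustrative; the real rules and field names would come from the Fox tables):

```python
# A hypothetical business rule: the cursor record is passed in as a dict,
# the rule derives fields, and the result goes back to the caller.
def apply_discount(record):
    """Apply a 5% discount to orders over 1000 (illustrative rule only)."""
    total = record["qty"] * record["unit_price"]
    discount = total * 0.05 if total > 1000 else 0.0
    record["total"] = total - discount
    return record

# The host (IronPython embedded in the app, or CPython via a script
# control) would convert a cursor row into this dict and call the rule.
invoice = {"qty": 12, "unit_price": 100.0}
result = apply_discount(invoice)
print(result["total"])  # 1140.0
```

Because the rule only touches a plain dictionary, the same function should run unchanged under IronPython on the desktop or CPython on a web host, which is the whole point of the exercise.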

Performance? Check this out ... "IronPython already provides better run-time performance than the conventional C-based Python implementation in some contexts, and, in an interview with eWeek, Microsoft developer Jim Hugunin claimed that the eventual goal for IronPython is to achieve performance that is 'at least two times faster than the native C-based implementation.'"

At the end of the day I have to ask: what if cloud computing ends up the winner? Am I going to have to rewrite again? I have my own personal reasons for believing that cloud computing is a serious competitor to the desktop. I have been a loyal MS adherent for many years, but I don't know why MS didn't implement Fox on the DLR when they obviously have the ability. It really ticks me off; this is the second time in my career that corporate decisions have killed a language I use. So no more lock-in. The 'suits' should not control computer languages; it is JavaScript and Python for me.

With the new implementation of Python (3000) not being backward compatible with earlier versions, now might be a good time to start looking seriously at the language, as it has just had a major overhaul.


Friday, May 30, 2008

Compost tea irrigation

There is a lot of interest in aquaponics, and rightly so; it's pretty cool that you can grow stuff from fish pee. But I am surprised at the lack of interest in compost tea irrigation (CTI), because it is such a simple concept. Until I hear otherwise, CTI will be the acronym I use for the process of using compost tea as the basis for growing plants.
This is a picture of the first stage of my setup:


Planter boxes ($14 from Big W) contain seasoned compost. They drain into a half barrel lined with a pond liner. The barrel holds an aquarium pump that simply recycles the water back to the planter boxes.

Why would I want to do this when I have a reasonable amount of space in the backyard to work with? Well, there are three reasons: temperature, pests and water consumption.
Temperature: I had a sunroom built to passively heat a barn, now it's coming into winter but I want to continue growing vegies. So why not use the sunroom? The picture above is from the start of winter. The snow peas climbing off to the right have now hit the ceiling. Our night time temperatures are about 1 degree C.
Pests: They drive me crazy: snails, slugs, birds and others. I refuse to use chemicals, and while I am working on organic deterrents I am spending a lot of time double-checking whether escargot is on the menu.
Water consumption: We use drip irrigation with mulch, and that's great, but the water disappears into the water table and the usage costs money. This could be fixed with a water tank, and that is certainly on the cards, but this way is a lot cheaper. In this setup an aquarium pump on a timer recycles the water in 2 x 15-minute periods during the day. It needs topping up now and then, but only minimally.


I have to say that this has been the best crop of lettuce I have ever grown. The lettuces in the pic look malformed but that is because I graze on them, rather than picking a whole lettuce I just take some outer leaves now and then.
Stage 2, an expansion of this trial, is taking form. I'll let you know how it goes.

Tuesday, April 1, 2008

SQL Express and user sessions

So I have my little utility, a port forwarder (with smarts). It reads a database to determine how to forward requests. But I ran into trouble when I implemented a service controller that needed access to the database. I really didn't want to stuff around with installation scripts etc.; I just wanted to connect to an .mdf file.
More details:
http://msdn2.microsoft.com/en-us/library/bb264564.aspx#sqlexpuser_topic7

It appears that there is scope within SQL Server for small utilities that need to access data. There is a type of deployment called xcopy deployment, where you simply copy the .mdf file to the target machine and request that SQL Server start with a special user instance. The user instance temporarily attaches the .mdf file, which is why you may not find your database in the SQL manager.
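For reference, a connection string for this style of deployment looks something like the following (the file name `portsaf.mdf` is just a placeholder):

```
Data Source=.\SQLEXPRESS;AttachDbFilename=|DataDirectory|\portsaf.mdf;Integrated Security=True;User Instance=True
```

`AttachDbFilename` is what triggers the temporary attach, and `User Instance=True` is what spins up the per-user copy of SQL Server Express.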

My problem (amongst many others) was that the database is used exclusively by the user instance of SQL Server. I was creating a Windows service, which would run as the "Network Service" user (I think), but my controller interface runs as the interactive user, so it could not connect to the .mdf. Once a user instance has been created, it holds the .mdf file open in case of more incoming requests, unless specifically told to shut down.

Not that I am driving forwards while looking backwards, but I have to say all this would have been handled nicely in FoxPro with:
close database
But no, now I need something like:
private string Detach(string mdfName, SqlConnection sqlConn)
{
    if (sqlConn.State == ConnectionState.Closed)
    {
        sqlConn.Open();
    }

    // Take the database offline first so the detach can succeed.
    SqlCommand cmd = new SqlCommand("sp_dboption", sqlConn);
    cmd.CommandType = CommandType.StoredProcedure;
    cmd.Parameters.AddWithValue("@dbname", mdfName);
    cmd.Parameters.AddWithValue("@optname", "offline");
    cmd.Parameters.AddWithValue("@optvalue", "true");
    try
    {
        cmd.ExecuteNonQuery();
    }
    catch (SqlException se)
    {
        return "SQL exception going offline\n" + se.ToString();
    }
    catch (InvalidOperationException ioe)
    {
        return "SQL InvalidOperationException going offline\n" + ioe.ToString();
    }

    // Now detach the database, which releases the lock on the .mdf file.
    cmd = new SqlCommand("sp_detach_db", sqlConn);
    cmd.CommandType = CommandType.StoredProcedure;
    cmd.Parameters.AddWithValue("@dbname", mdfName);
    try
    {
        cmd.ExecuteNonQuery();
    }
    catch (SqlException se)
    {
        return "SQL exception when detaching\n" + se.ToString();
    }
    catch (InvalidOperationException ioe)
    {
        return "SQL InvalidOperationException when detaching\n" + ioe.ToString();
    }

    if (sqlConn.State == ConnectionState.Open)
    {
        sqlConn.Close();
    }
    return "";  // Empty string signals success.
}

I am sure that some of that is not necessary, but it worked.

Wednesday, March 26, 2008

Dead wasps sting

My new baby daughter is giving us lessons in life every day. But here's a new one: don't put a dead wasp in your mouth; they can still sting. Apart from a big fat lip, she survived the experience.
I have to say that I admire something that will still fight back even when it is long dead.

Delving into sockets

The job is to receive data on port 80, determine what URL it was targeting, and then redirect the request to the appropriate server.

The purpose being that if you used a service like DynDNS you could have lots of URLs without having to register for lots of IP addresses. Yes, you could load up the one server with processes to respond to the requests, but that is not scalable. Yes, you could set up a load-balanced server farm, but that's a little too expensive.
Let's just have an internal network of cheap desktop PCs running the apps. The server has a permanent IP address, or could even have a dynamic IP. It is dual-homed with the second NIC connected to an internal switch.
Then you have a database of URL string and target IP combinations.
It is kind of like port forwarding with a little intelligent rerouting in between. I am calling it portSAF, for "port Split And Forward". I am sure that someone has already done this, but I couldn't find it, so here we go.
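The routing decision itself is simple enough to sketch. Assuming the lookup table is an in-memory dict standing in for the database of URL/IP pairs, picking the target from a raw HTTP request might look like this (`routes` and `pick_target` are illustrative names, not part of any real portSAF code):

```python
# Map Host header values to internal target addresses. In portSAF this
# lookup would come from the database; a dict stands in for it here.
routes = {
    "app1.example.dyndns.org": "192.168.0.11",
    "app2.example.dyndns.org": "192.168.0.12",
}

def pick_target(raw_request: bytes, routes: dict):
    """Parse the Host header out of a raw HTTP request and look up
    the internal server that should handle it. Returns None if the
    request has no Host header or no route matches."""
    for line in raw_request.split(b"\r\n"):
        if line.lower().startswith(b"host:"):
            host = line.split(b":", 1)[1].strip().decode()
            # Drop an explicit port if the client sent one (host:80).
            host = host.split(":")[0]
            return routes.get(host)
    return None

request = b"GET / HTTP/1.1\r\nHost: app2.example.dyndns.org\r\n\r\n"
print(pick_target(request, routes))  # 192.168.0.12
```

The forwarder would then open a socket to that internal address and shuttle bytes in both directions; the lookup above is the "split" part of Split And Forward.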
So the first step is to receive requests on port 80. To do that we need to use multithreading.
In the form_Load() {
    ....
    alSockets = new ArrayList();
    thdListener = new Thread(listenerThread);
    thdListener.IsBackground = true;
    thdListener.Start();
    ....
}
We need to set .IsBackground so that when the form closes the threads are terminated as well.

public void listenerThread()
{
    Int32 port = 80;
    TcpListener tcpListener = new TcpListener(IPAddress.Any, port);
    tcpListener.Start();
    for (;;)
    {
        if (tcpListener.Pending())
        {
            Socket handlerSocket = tcpListener.AcceptSocket();
            if (handlerSocket.Connected)
            {
                lock (this)
                {
                    alSockets.Add(handlerSocket);
                }
                Thread thdHandler = new Thread(handlerThread);
                // When the foreground thread terminates so should this one.
                thdHandler.IsBackground = true;
                thdHandler.Start();
            }
        }
        Thread.Sleep(10);
    }
}
The listenerThread method listens for new connections. When one is detected, a socket is created and added to an array; then a new thread is started, which will use the last socket in the array.

The use of Thread.Sleep() was to stop the CPU usage going to 99%.
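As an aside, the same background-thread idea translates directly to Python, which is where I am headed anyway: a thread marked as a daemon (the rough equivalent of setting .IsBackground = true) will not keep the process alive once the main thread exits. The `listener` function here is just a stand-in for an accept loop:

```python
import threading
import time

def listener():
    # Stand-in for a socket-accept loop; real code would block on
    # accept() rather than spin on a timer.
    while True:
        time.sleep(0.1)

# daemon=True means this thread dies with the main program,
# just like .IsBackground in .NET.
t = threading.Thread(target=listener, daemon=True)
t.start()
print(t.daemon)  # True: the process can exit even while this thread runs
```

Without daemon=True (or .IsBackground), closing the form or the script's main thread would leave the listener running and the process hanging around.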