
Showing posts from October, 2013

What were you doing when you hit 10,000 hours?

I hit 10,000 hours just about the time that I started working for NaBanco. I was the first programmer hired to build a brand-new gift card system, which was eventually called ValueLink. I had just come off of a string of successful projects working as a contractor at IBM. Those projects were mostly hardware- and firmware-based. Making the transition to server-based applications required a new way of thinking.

I was required to work with a brand-new series of technologies. That included Solaris, Informix, Oracle's SQL, Windows desktop programming, IVR programming, Java, Perl, X.25, IBM's JCL implementation for OS/2, REST, web server development, and ZON.
Keep in mind this was 1993, and I had spent most of the previous five years writing assembly language programs.
I can only conclude that this meets Malcolm Gladwell's 10,000-hour criterion.

The damnedest small print ever

I was watching a Disney commercial the other day and saw the small print. It reads: "Do not attempt unless you are a mermaid."

Golang package management

In recent months many Golang aficionados have been complaining about the current state of the package management system. At first I did not hold a position of my own, until I read some articles from the authors of the Golang system. As they were the experts, I agreed with them more readily than with some of the neophytes who were arguing for different types of package management. Now my position is basically the same, however for a much different reason.
Before producing a third-party library, one usually puts together a set of requirements. These requirements are essentially a contract between the library provider and its end users. It's my contention that the contract should be immutable once it is stabilized. Therefore, if the objective is to make the library as stable as possible and as conformant to the initial requirements as possible, the various types of version management should be unnecessary. At that point one should need only to fork the library or application as ne…

Continuous integration: the buzzword du jour

Build and test cycles integrated into the standard makefile have been around for many years. Triggered builds based on commits have been in production for many years too. Giving that process a name like continuous integration is admirable, since it conjures up instant recognition and understanding.
However, I am starting to get the sense that continuous integration is being overused, marginalized, and productized by marketing departments in order to sell services or garner favor.

Is Apple moving OS X closer to iOS?

One of the challenging activities for anybody who owns a computer is the day-to-day maintenance. It's even worse when the maintenance needs to be performed on computers for children or parents. If you're the computer professional in the family, and there usually is one, then your job is maintenance and upgrades.
The best day of my life was when my mother-in-law bought an iPad. Except her laptop is still having problems, and her iPad is running out of memory because she didn't buy a big enough iPad.
But close enough.
If Apple developed MacBook Airs that were running iOS, then that might be the sweet spot.

Siri said what?

Siri has not been performing her best today. Many times it takes quite a while for her to translate my voice into text. It's a great feature; however, I believe that the phone itself is not performing the task. It must be going up to the cloud somewhere for processing. If that's the case, then what happens to all this text and context as it applies to networks and social activities and contacts, etc.? This is different than capturing search terms. This is different than capturing emails. This is capturing life as it happens. Many espionage movies talk about having open microphones via the cell phone and capturing conversations across the globe. This can't be too far from reality.

LinkedIn Endorsements are Seriously Flawed

The more I endorse a single person on LinkedIn, the more that person seems to endorse me, or the more LinkedIn seems to solicit me for another endorsement for that person. LinkedIn would appear to be using a feedback loop to decide which contacts to prompt for, rather than choosing at random.


I once thought that Docker was a good idea

... but now I have lost that loving feeling.

The problem is that it is simply not that easy to set up beyond hello world. Any real work starts to generate vast amounts of deltas that need to be stored. Storage requirements, in terms of the number of deltas that the containers create, may not be predictable.

As I consider the overhead of VirtualBox, VMware, Parallels, etc., Docker is less predictable. So there are a number of things you need to do before starting...


- get lots of local disk space
- use Dockerfiles so that you can start from scratch repeatedly, reducing all the cruft
- get a clear sense of how it works and where all the artifacts reside on the host OS (/var/lib/docker)
- get the real scoop on how to clean up under every condition... and there are many, and some with nulls (see the sketch below)

Or just do something else for now. Check out the Parallels Virtuozzo project, which is containers too.
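For the cleanup item above, here is a minimal sketch in Go, assuming only that the docker CLI is on the PATH. It is roughly the shell one-liner docker rm $(docker ps -a -q), not an official Docker tool.

    package main

    import (
        "fmt"
        "os/exec"
        "strings"
    )

    func main() {
        // List the IDs of all containers, running or stopped.
        out, err := exec.Command("docker", "ps", "-a", "-q").Output()
        if err != nil {
            fmt.Println("docker ps failed:", err)
            return
        }
        for _, id := range strings.Fields(string(out)) {
            // Remove each container; docker refuses to remove running ones.
            if err := exec.Command("docker", "rm", id).Run(); err != nil {
                fmt.Println("could not remove", id, ":", err)
            }
        }
    }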

Convention over configuration or vice versa?

The implementation details for configuration over convention, or vice versa, are trivial. The real question is which one is going to give you the best long-term advantages.
Configurations can be shared across the application space; as new applications are brought on board, all of that information is shared. The disadvantage is that the configuration has to be consulted as the code segments are reverse engineered.
Conversely, by convention simply means that the configuration information is stored within common classes. This is just an object representation of the configuration information.
The only advantage of convention over configuration appears to be in the naming of accessor functions: whereas one uses get and set for the configuration method, in the by-convention strategy the data is accessed by named methods.
A hybrid approach would be novel, except that it would have the side effect of producing additional code that would need to be tested, debugged, and so on.
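To make the trade-off concrete, here is a minimal Go sketch of the two styles; the names (Config, DBSettings) are hypothetical.

    package main

    import "fmt"

    // Configuration style: values live in a shared store and are consulted
    // through generic get/set accessors keyed by name.
    type Config struct{ values map[string]string }

    func (c *Config) Get(key string) string { return c.values[key] }
    func (c *Config) Set(key, value string) { c.values[key] = value }

    // Convention style: the same information is carried by a common class
    // (a struct here) and reached through named accessors instead of get/set.
    type DBSettings struct {
        Host string
        Port int
    }

    func main() {
        cfg := &Config{values: map[string]string{}}
        cfg.Set("db.host", "localhost")
        fmt.Println(cfg.Get("db.host")) // configuration: consult the store by key

        db := DBSettings{Host: "localhost", Port: 5432}
        fmt.Println(db.Host) // convention: a named accessor, no lookup needed
    }

Either way the information is the same; only the access path differs, which is why the long-term question matters more than the implementation.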

Code generation and its implementation with continuous integration

With a single developer or a small team, it's easy to conceive that the code generation step could be done as a pre-commit step for any sort of version control system. In slightly larger teams the generated artifacts would merge no less easily than the source files. This does create some problems when the files require additional merging or correction to account for conflicts. And when the teams get larger, it's even more complicated to time the builds and to maintain consistency across generated code segments.
I have been looking at some of the continuous integration and deployment solutions, including some of those that are hosted, and in many of those cases there is no consideration for generated code. One fact that falls out of the Golang environment is that one of the initial requirements was to eliminate the makefile. While Golang was successful, one still needs batch files in order to build generated applications.
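A minimal sketch of the pre-commit idea above: a small Go program that a Git hook (or any VCS equivalent) can invoke. "mygen" and the generated/ directory are hypothetical stand-ins for whatever the project actually uses.

    package main

    import (
        "fmt"
        "os"
        "os/exec"
    )

    // run executes a command and forwards its output to the terminal.
    func run(name string, args ...string) error {
        cmd := exec.Command(name, args...)
        cmd.Stdout, cmd.Stderr = os.Stdout, os.Stderr
        return cmd.Run()
    }

    func main() {
        // "mygen" stands in for whatever code generator the project uses.
        if err := run("mygen"); err != nil {
            fmt.Fprintln(os.Stderr, "code generation failed:", err)
            os.Exit(1) // a non-zero exit aborts the commit
        }
        // Stage whatever the generator rewrote so the commit stays consistent.
        if err := run("git", "add", "-A", "generated/"); err != nil {
            os.Exit(1)
        }
    }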

Why I hate frameworks

One of the classic problem sets for programmers is selecting the right tool for the job. Typically that means choosing the programming language which is closest to the problem set.
Joel Spolsky's blog is running a discussion titled "Why I Hate Frameworks." I think he got it wrong!
There are two very strong reasons for using frameworks. The first reason is that frameworks tend to handle the impedance mismatch between the problem and the programming language. The second is that when you own the framework, it solves more problems than it creates.

GoLang - no make files

I just read that one of the design principles of GoLang was that the authors did not want to use make files. This in itself is an honorable goal, as anyone who has worked with the autotools knows. However, the one thing that I have realized is that since my project includes a code generator that is built in Go, and that includes various static and dynamic artifacts as any web application does... delivering one and only one artifact is not enough, and so I'm developing a number of potential strategies:

- Store the artifacts in a SQLite DB table and import as needed into a cache
- Find a way to convert the files into code that can be compiled into the executable (sketched below)
- Deliver 3 artifacts and import on demand... a) the executable, b) the VCS repository, c) an intermediate SQLite DB

I'm currently leaning toward #1, and in this way I only have to deliver the executable and the DB; however, there is an advantage to delivering the source in VCS too. This means that wherever you are there is potential to read…
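As an illustration of strategy #2, here is what the generator's output might look like; the file names and contents are stand-ins.

    // assets_generated.go -- what the code generator would emit: each static
    // file becomes a Go value, so the deliverable is a single executable.
    package assets

    var files = map[string][]byte{
        "index.html": []byte("<html><body>hello</body></html>"),
        "app.css":    []byte("body { margin: 0; }"),
    }

    // Get returns the embedded file contents, or nil if the name is unknown.
    func Get(name string) []byte { return files[name] }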

First candidate for a Golang idiom

When developing a package in Golang, that package may be made up of multiple source files. If more than one of those source files imports from the same package, whether that's a third-party library or a core library, then those APIs are candidates for a functional layer within the package itself.
The intent is to reduce the number of files doing the importing. This is partially to make the code more readable, but it's also to eliminate some of the redundant editing as the feature functions are added to and removed from the package itself.
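A minimal sketch of the idiom, using database/sql as the stand-in shared import; the package and wrapper names are hypothetical.

    // store.go -- the only file in the package that imports database/sql;
    // the other source files call these thin wrappers instead.
    package mypkg

    import "database/sql"

    var db *sql.DB

    // Open wires up the one shared handle for the whole package.
    func Open(driver, dsn string) (err error) {
        db, err = sql.Open(driver, dsn)
        return err
    }

    // QueryRow hides the imported API behind a package-local function, so
    // adding or removing feature functions elsewhere never touches an
    // import list.
    func QueryRow(query string, args ...interface{}) *sql.Row {
        return db.QueryRow(query, args...)
    }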

Code generator workflow

When working with code generators, the hardest question to answer is whether the generated code should be stored in your version control system during the development cycle instead of the build cycle.
I don't have a very good answer for this question, because when one considers continuous integration the answer is further complicated. The answer that comes to mind first, or maybe second, is that the code that is being generated should be in a separate project from the code generator. However, when there is a recursive relationship between the code generator, the library the code generator uses, and the generated code, then it gets more complicated when trying to decide where the generated code resides.
Maybe I'll answer this next time. But your comments are welcome.

"... plus any applicable metaprarmeters"

I do not like much about Puppet and Chef. My biggest aversion is that they are both written in Ruby, which has a tremendous side effect on sysadmins; however, that's a different story.
I do not like DSLs either. They are rarely complete and as designed they capture the domain knowledge at the expense of competition and innovation.
However, one thing I really like is the Puppet type reference implementation. It's part of their DSL and very well documented. I'm thinking about implementing the types in GoLang as I need them. My only concern is that the library is extensive and there is no way to know when one is feature complete. Additionally, at least one reference includes "... plus any applicable metaparameters" and that's just dirty!
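To make that concrete, here is a hypothetical sketch of what a Puppet-style type might look like in Go; Resource, File, Ensure, and Apply are my names, not Puppet's API.

    // Package resource sketches a Puppet-style type in Go.
    package resource

    import "os"

    // Resource mirrors the shape of a Puppet type reference entry: a name,
    // some properties, and an apply step that converges the system.
    type Resource interface {
        Name() string
        Apply() error
    }

    // File is a minimal analog of Puppet's file type.
    type File struct {
        Path   string
        Ensure string // "present" or "absent"
    }

    func (f *File) Name() string { return f.Path }

    // Apply converges the file toward its declared state.
    func (f *File) Apply() error {
        switch f.Ensure {
        case "absent":
            return os.Remove(f.Path)
        default: // "present"
            file, err := os.OpenFile(f.Path, os.O_CREATE, 0644)
            if err != nil {
                return err
            }
            return file.Close()
        }
    }

Even this tiny analog hints at the feature-completeness problem: the real file type has dozens of parameters, plus any applicable metaparameters.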

Triggering a build when a source code repository file changes is less than ideal

The Fossil source code manager is a very promising tool; however, it does not provide hooks that could be used to trigger a build upon commit. One possible solution is to watch the file system and look for updates to the file in size and/or time. The challenge, however, is when the file is under load by multiple users or multiple commits. The filesystem is going to trigger the watch process on the first change, plus or minus whatever time interval is configured. As a result, the second update might not actually get picked up as part of the current build, depending on the interleave between the trigger and the actual cloning of the original repository. Additionally, if the actual commit taking place by the third and fourth parties is a number or series of commands, that could adversely affect the build process as well.
Other strategies, like waiting for the files to settle after having determined that there was a change, are plausible; however, the delay from the time of the trigger to the actua…
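Here is a minimal polling sketch of the settle strategy (size plus mtime, no filesystem-notification library); the repository file name is hypothetical, and real code would clone and build where the comment sits.

    package main

    import (
        "fmt"
        "os"
        "time"
    )

    // settled reports whether the file's size and mtime stayed constant
    // for the whole quiet period.
    func settled(path string, quiet time.Duration) bool {
        before, err := os.Stat(path)
        if err != nil {
            return false
        }
        time.Sleep(quiet)
        after, err := os.Stat(path)
        if err != nil {
            return false
        }
        return after.Size() == before.Size() &&
            after.ModTime().Equal(before.ModTime())
    }

    func main() {
        const repo = "project.fossil" // hypothetical repository file
        var lastBuilt time.Time
        for {
            fi, err := os.Stat(repo)
            if err == nil && fi.ModTime().After(lastBuilt) &&
                settled(repo, 2*time.Second) {
                fmt.Println("repository settled; cloning and building")
                lastBuilt = fi.ModTime()
                // clone the repository and kick off the build here
            }
            time.Sleep(5 * time.Second)
        }
    }

Note the race described above is still there: a commit that lands between the settle check and the clone simply rides along in the next cycle.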

"technically" is the new "clearly"

Recently I was listening to a speaker in the "Big Think" series on YouTube. One criticism the speaker had was the use and overuse of the word clearly. To paraphrase, he/she said that "clearly" is an objectionable word in a convincing argument because things are not always as obvious or clear as the word implies.

That got me to thinking about the overuse of the word "technically". In many cases "technically" can be replaced with the word "truthfully", and yet in either case the intent is to express the truthiness (not a real word) of a particular statement.

I assert that people who use "technically" are misusing the word. It more likely means that a statement is truthy because of some technicality or side effect, rather than for some bona fide technical reason. A person making a technical argument does not usually say "for technical reasons" or "technically".

Conjuring another line of reasoning; writing code or operating in the edg…

Why I'll Never Publish a Serious Open Source Project

I have worked on a number of projects that I thought would have been successful open source projects, or might provide some mutual benefit, but in the end I decided not to expose these projects. As a professional programmer and architect, I simply do not have enough time in the day to write code, evangelize, and, more importantly, defend my projects from hostile takeovers.

One thing I've noticed is that developers are already forking projects en masse. Even in my own professional endeavors I fork projects simply to ensure version compatibility and avoid the horrors of TIP creep.

Of course, if things got out of control I could always fork my own projects. But even that feels destructive.

Mike Rowe "innovation without imitation"

If I understood Mike Rowe's comments, he meant that "imitation" covers the mundane and repetitive manufacturing jobs, the sort that we hear about in the news when talking about Apple and now Sony manufacturing at Foxconn.

I started a thought experiment. What if the likes of Apple, Sony, Samsung, Microsoft... changed their approach and worked on the software, making it faster, lighter, smaller, more portable? At what point could we stop caring about Moore's Law?

Moore's Law is less a technological predictor than it is a marketing schedule.

Object oriented namespace fractal dimension

Back in the early days of assembly language programming, and even some of the earlier BASIC programming, the idea of object-oriented programming was never a consideration. Everything from variable names to subroutines existed in the same namespace. Software in the 70s and 80s worked just as well as its counterparts today; however, modern programming languages partition libraries and access to functions and variables by using namespaces.
The current project I'm working on emphasizes the notion of a global namespace: global variable names and global function names. The names themselves are becoming reminiscent of the old days of COBOL, when artifact names were much larger than they typically are today.
While namespaces can collide, it doesn't typically happen; the reality is that the names are predictable, and there's absolutely no reason why they should ever overlap. Because they're predictable, one can develop an internal mechanism for predicting the way names and functions and artifa…

Are you raising the bar with agile?

Using agile methodologies to measure the effectiveness of agile is like letting the fox guard the hen house. This is no different than letting individuals who are vested one way or the other make the same measurements. In order to measure the effectiveness of agile, one needs to look to the scientific method in order to determine the right methodology for measuring its effectiveness.
Just because the authors of the agile manifesto are also the authors and purveyors of agile process improvement does not necessarily mean agile is going to work for your environment. The programmers, consultants, and business people who defined the agile manifesto used it to model their behavior in their environments. The thing that people continue to forget about this environment is that these individuals were really smart.
The idea that their methodologies could be applied to the rank-and-file entry-level programmer and yield the same results is just simply false. One cannot substitute multiple decades …

The view is nothing but a view

Toying around with the Golang template framework, one realizes that the view is just a view. While there is some support for basic conditionals and loops, you really cannot mutate any of the data in any way by default. If you want to transform any data in your template, you need to import local functions. This is not entirely a bad thing, but probably undesirable from an idiom perspective.
After working with the templates for almost 2 weeks now, I have come to the conclusion that it is better if the templates are used strictly to expand your data into the forms and format in which they are being visualized.
Once the templating and the user-facing GUI are co-mingled, enhancements are nearly impossible. But that's just my intuition.
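For reference, the import-local-functions mechanism mentioned above uses text/template's FuncMap; "upper" is just a name I picked.

    package main

    import (
        "os"
        "strings"
        "text/template"
    )

    func main() {
        // FuncMap maps template-visible names to ordinary Go functions.
        funcs := template.FuncMap{"upper": strings.ToUpper}

        t := template.Must(template.New("view").Funcs(funcs).Parse(
            "Hello, {{upper .Name}}!\n"))

        // The view only expands the data into its display form; no mutation.
        t.Execute(os.Stdout, struct{ Name string }{"world"})
    }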

The human environment adapts to the human faster than the human adapts to its environment

This might not make any sense at all, but as I think about the brain-eating amoeba, the flesh-eating bacteria, the human-eating bacteria, I start to wonder whether or not these things are in our environment because we have changed our environment and they have adapted to survive. And their ability to adapt is faster than our own.
Without any real statistical evidence I wonder why we're hearing more about the edge cases of our own mortality as our mortality rates drop.

iPhone video compression

When you send a video clip from your iPhone, the iPhone application compresses the video before sending. So the question is: why doesn't the phone compress the video in the background, or at some more opportune time? Waiting until the video is being transmitted does two things: it takes too long to initiate the transmission, and it inflates the storage required for the original media. It's quite possible that Apple is now in the storage business and not the phone business.

The Great Equalizers

This is an email I wrote recently.
SUBJECT: this is worth a reread; it might help with the whole zero downtime issue

http://en.wikipedia.org/wiki/Database_normalization

I think that a more normal DB and an ORM layer will make stored procedures obsolete except for some rudimentary functions. But it's still going to take some discipline along with some deep acceptance and understanding. However, one thing I have discovered about "things": The internet became one of the great equalizers, where anyone can profess to be as smart as Date, Coad, Yourdon, Knuth and few will disagree. I've also learned that Scrum, Kanban, and particularly Agile are also great equalizers. The team has a set point and all members have equal say. Even the most junior and uninformed members can significantly lower the set point.

It was in response to a comment made by a college student in response to a comment I had posted. The student tried to impose his opinion; not because he knew more or had some clear and co…