 BLOG >> September 2013

Measuring Startup Success [Startups]
Posted on September 27, 2013 @ 08:28:00 AM by Paul Meagher

Tom Bloomfield has a useful blog post on measuring startup success. Citing Paul Graham, he argues that "merely measuring something has an uncanny tendency to improve it."

Tom makes some useful suggestions such as:

  • Start by measuring one thing. You need some key metric to track whether you are improving or not.
  • Choose a proxy for long-term value. Because startups may not be profitable from the get-go, you still need to figure out what measure must be constantly improving to help guarantee the long-term survival and growth of your startup.
  • Break down your key metric into Key Performance Indicators (KPIs) that track the sub-activities that need to be performed if you are to meet your improvement goals. These should be objective measures of performance that are shared with all employees in the company.
  • Use your KPI planning to implement a business strategy.


Intro to Linear Programming: Part 2 [Decision Making]
Posted on September 26, 2013 @ 07:59:00 AM by Paul Meagher

In yesterday's blog, I gave a brief overview of what Linear Programming is and the main equations that are used to specify a Linear Programming decision problem.

Today I want to provide you with some resources you might use to learn how to use Linear Programming to optimize some aspect of your business.

The first resource I would direct you to is a series of IBM developerWorks articles on using the GNU Linear Programming Kit (GLPK) to solve linear programming problems. This 3-part series walks you through 5 different problem scenarios to which you can apply linear programming and shows how to specify the equations in a format that the GLPK solver can work with.

The second set of resources I would point you to are links to free, open-source software packages that can be used to solve linear programming problems (often referred to as "LP solvers").

The two main high-quality open-source LP solver packages that I am aware of are the GLPK package mentioned in the article links above and the LPSolve package. Here are a couple of links for learning more about these packages.

  • Wikipedia Page for GLPK. The page has useful links to various implementations of GLPK, including a JavaScript-based one that looks interesting as it would be easy to embed in a web page.
  • LPSolve Reference Guide. In my opinion, LPSolve's main feature is that it has bindings for many popular programming languages, which makes it easy to call from those languages. The documentation is also quite good.

The 3 tutorial links and the links to these 2 high-quality LP solvers should help you get started in learning the nuts and bolts of Linear Programming and deciding whether it might be useful for optimizing some aspect of your business.


Intro to Linear Programming: Part 1 [Business Models]
Posted on September 25, 2013 @ 10:28:00 AM by Paul Meagher

Today I want to begin introducing you to a powerful optimization technique you might have occasion to use in your business. The technique is called Linear Programming and is the primary technique taught and used in Operations Research. Wikipedia defines Operations Research as "a discipline that deals with the application of advanced analytical methods to help make better decisions. It is often considered to be a sub-field of mathematics. The terms management science and decision science are sometimes used as synonyms".

Linear Programming is a fairly difficult technique to master because it involves some advanced math and the ability to translate a business problem into a set of equations representing what it is you want to optimize (i.e., maximize or minimize the value of) and the constraints that exist on how you can solve the problem (i.e., subject to constraints on labor, capital, machinery, time, etc.). The ability to translate a business problem into a set of equations is generally acquired by studying standard types of business problems that Linear Programming has been applied to and then using those example solutions as a template for applying the technique to your own similar situation.

In today's blog, I simply want to display the main set of formulas that are used in linear programming. I am reproducing the main linear programming formulas from the book Mathematical Programming for Agricultural, Environmental, and Resource Economics by Harry M. Kaiser & Kent D. Messer, Wiley, 2012. This was also an opportunity for me to take the MathJax library I set up yesterday for another test drive and learn more about how to add some professional looking math to my blog.

Here is the general form of the Linear Program (LP) model. First we need to define the objective function $Z$ that we want to maximize or minimize.

\[Z = c_1x_1 + c_2x_2 + \cdots + c_nx_n\]

Then we need to specify the various business constraints (labor costs, material costs, transport costs) that our business decision is subject to. These constraints are formulated as equations whose left-hand side is less than, equal to, or greater than some value on the right-hand side:

\[a_{11}x_1\ + a_{12}x_2 + \cdots + a_{1n}x_n \lbrace {\le, =, \ge} \rbrace b_1\]

\[a_{21}x_1\ + a_{22}x_2 + \cdots+ a_{2n}x_n \lbrace {\le, =, \ge} \rbrace b_2\]

\[\cdots\]

\[\cdots\]

\[a_{m1}x_1\ + a_{m2}x_2 + \cdots+ a_{mn}x_n \lbrace {\le, =, \ge} \rbrace b_m\]

Finally, we generally add the constraint that all "activities" are non-negative (we can't have negative activity values in our production model).

\[x_1, x_2, \cdots x_n \ge 0\]
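
To make this concrete, here is a small hypothetical example (all of the numbers are purely illustrative). Suppose a business makes two products, earning a profit of 40 per unit of product 1 and 30 per unit of product 2. Each unit of product 1 requires 2 hours of labor and 1 unit of material, each unit of product 2 requires 1 hour of labor and 3 units of material, and only 100 labor hours and 90 material units are available. Maximizing profit $Z$ gives:

\[Z = 40x_1 + 30x_2\]

\[2x_1 + 1x_2 \le 100 \quad \text{(labor hours)}\]

\[1x_1 + 3x_2 \le 90 \quad \text{(material units)}\]

\[x_1, x_2 \ge 0\]

Feeding these equations to an LP solver yields the optimal production plan $x_1 = 42$, $x_2 = 16$, with a maximum profit of $Z = 2160$.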

We'll explore the technique of linear programming in further detail in my next blog.

Note: If you want to see the TeX code used to generate the equations in this blog, all you have to do is right-click on the equation and MathJax supplies a viewer application for inspecting and copying the formulas.


MathJax Testing [Site News]
Posted on September 24, 2013 @ 02:16:00 PM by Paul Meagher

MathJax is a JavaScript library (www.mathjax.org) that makes writing math on the web easier and more professional looking. It is generally not difficult to install or get working. I had issues with escape "\" characters not playing well with PHP's stripslashes function, and I also needed to add a config option for dealing with escape delimiters. After these issues were resolved, MathJax appears to work as advertised on their site.
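
(For reference, a typical MathJax 2.x setup at the time consisted of a configuration block followed by the library script, along the lines of the snippet below. The processEscapes option shown here is one way of handling escaped delimiters; it is offered as an illustration, not necessarily the exact option referred to above.)

<script type="text/x-mathjax-config">
  MathJax.Hub.Config({
    tex2jax: {
      inlineMath: [ ['$','$'], ['\\(','\\)'] ],
      displayMath: [ ['$$','$$'], ['\\[','\\]'] ],
      processEscapes: true   // treat \$ as a literal dollar sign rather than a math delimiter
    }
  });
</script>
<script type="text/javascript"
  src="http://cdn.mathjax.org/mathjax/latest/MathJax.js?config=TeX-AMS-MML_HTMLorMML">
</script>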

To take MathJax for a tour, I first downloaded the "source code" for a classic textbook, Introduction to Probability, by Grinstead and Snell. The "source code" I downloaded was the TeX code used to write the book. I figured that one way to learn how to use the TeX language to write math symbols would be to see how it was used in a beautifully typeset textbook whose content I want to read.

So here are a few sample sentences taken from the textbook, now reproduced online using MathJax:

Let $X$ be a numerically-valued discrete random variable with sample space $\Omega$ and distribution function $m(x)$. The expected value $E(X)$ is defined by $$ E(X) = \sum_{x \in \Omega} x m(x)\ , $$ We often refer to the expected value as the mean and denote $E(X)$ by $\mu$ for short.

The probability of getting \(k\) heads when flipping \(n\) coins is:

\[P(E) = {n \choose k} p^k (1-p)^{ n-k} \]

P.S. I had the opportunity to correspond with one of the authors of the Introduction to Probability textbook, Laurie Snell, a few years back on a few probability concepts (Markov Processes, Analysis of Repeated Surveys, and Chi Square analysis). Snell was around 80 at that time but still very active in the probability community, spearheading the Chance News project. Snell also collaborated with Kemeny on writing a Finite Math textbook. Kemeny was a co-creator of the BASIC programming language.


Good Advice on Fund Raising [Venture Capital]
Posted on September 19, 2013 @ 08:44:00 AM by Paul Meagher

Paul Graham is one of the cofounders of Y Combinator and its leading voice. Since 2005, Y Combinator has funded over 450 startups, including Dropbox, Airbnb, Stripe, and Reddit. Paul's latest blog posting, How to Raise Money, offers excellent and authoritative advice for anyone trying to raise money for their startup. Below is a listing of his main points. To obtain more details on each point, you should read the full blog:

  • Don't raise money unless you want it and it wants you.
  • Be in fundraising mode or not.
  • Get introductions to investors.
  • Hear no till you hear yes.
  • Do breadth-first search weighted by expected value.
  • Know where you stand.
  • Get the first commitment.
  • Close committed money.
  • Avoid investors who don't "lead."
  • Have multiple plans.
  • Underestimate how much you want.
  • Be profitable if you can.
  • Don't optimize for valuation.
  • Yes/no before valuation.
  • Beware "valuation sensitive" investors.
  • Accept offers greedily.
  • Don't sell more than 25% in phase 2.
  • Have one person handle fundraising.
  • You'll need an executive summary and (maybe) a deck.
  • Stop fundraising when it stops working.
  • Don't get addicted to fundraising.
  • Don't raise too much.
  • Be nice.
  • The bar will be higher next time.
  • Don't make things complicated.


The Sharing Economy [Trends]
Posted on September 17, 2013 @ 07:06:00 AM by Paul Meagher

Janelle Orsi is a lawyer specializing in supporting businesses and non-profits that share services and resources. In a recent Post Carbon Institute article called The Sharing Economy Just Got Real she discusses what the sharing economy is and some of the legal grey areas that sharing companies operate within. It is worth a read to learn more about this important economic trend and how it might legally evolve in the future.

Below is a video promoting the idea of sharing resources at the community level and all the good things that flow from such sharing.

The sharing economy is one trend for entrepreneurs and investors to keep an eye on, as it has the potential to be very disruptive and potentially lucrative for those developing the sharing platforms.


Processing Decisions [Decision Making]
Posted on September 12, 2013 @ 09:14:00 AM by Paul Meagher

In the last few blogs I've discussed using Graphviz to generate nice looking decision trees. Sometimes, however, it is difficult to get exactly what you want out of Graphviz because it is designed to generate graphs dynamically (easy to change values/labels and generate a new graph) rather than uniquely crafted one-offs. I decided to explore other alternatives for creating decision tree graphs and, to make a long story short, I have decided that a programming language called "Processing" (see the http://www.processing.org website) offers the flexibility I need along with many other benefits.

Processing greatly expands the possibilities for visualizing decisions because it can be used to create multimedia output, including graphs, on a huge range of devices (e.g., desktop browsers, mobile devices, embedded devices). It is also a very elegant language and has a javascript+html5 implementation that makes executing "sketches" (the Processing term for a program) in a webpage a breeze (the output of which can be static, animated, sonically enhanced, etc.). It also has an excellent development environment bundled with it and a large open-source developer community. Finally, there are some extremely well-written books on learning and using the language. The one to start with is by the language's authors:

Processing: A Programming Handbook for Visual Designers and Artists
Casey Reas and Ben Fry (Foreword by John Maeda).
Published August 2007, MIT Press. 736 pages. Hardcover.

To give you a flavor of the language, I'll offer up a couple of Processing sketches inspired by the book above. I was looking for some code that would get me started on drawing a tree and found code (p. 202) for drawing a T.
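
(The book's listing isn't reproduced here; the following is a minimal sketch in the same spirit, with illustrative window size, coordinates, and lengths rather than the book's exact code.)

void setup() {
  size(400, 400);
  background(255);
  // draw a single T rising from the bottom centre of the window
  drawT(width/2, height, 160);
}

// Draw a "T": a vertical stem of the given length topped by a horizontal bar
void drawT(float x, float y, float len) {
  float armSize = len/2;
  line(x, y, x, y - len);                              // stem
  line(x - armSize, y - len, x + armSize, y - len);    // bar
}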

The code above generates a simple tree structure that could be the starting point for a decision tree.

Cool! By making a few modifications to this program (e.g., adding recursive calls to the drawT function), a fractal tree can be generated with the following code:
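
(Again, this is an illustrative sketch rather than a listing from the book: recursion is added so that each T spawns two half-sized Ts at the ends of its bar.)

void setup() {
  size(400, 400);
  background(255);
  drawT(width/2, height, 160);
}

void drawT(float x, float y, float len) {
  if (len < 4) return;                                 // stop recursing at small branch sizes
  float armSize = len/2;
  line(x, y, x, y - len);                              // stem
  line(x - armSize, y - len, x + armSize, y - len);    // bar
  // recursive calls: grow a half-sized T from each end of the bar
  drawT(x - armSize, y - len, len/2);
  drawT(x + armSize, y - len, len/2);
}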

The output of the sketch above is a fractal tree built from progressively smaller Ts.

This, of course, is not a full-bodied decision tree, but it gives us some insight into how the skeleton of a simple binary decision tree might be created: by calling a drawT function multiple times with the appropriate positional parameters. I'd prefer a left-to-right layout rather than a bottom-to-top layout so that connections can be labelled more easily.


Devil is in the Details [Decision Trees]
Posted on September 10, 2013 @ 07:25:00 AM by Paul Meagher

In my previous blog, I showed how to construct a nice decision tree for a decision about how much nitrogen to apply to a crop. In this blog, I want to advance our thinking about decision trees in two ways:

  1. Show how expected returns can be calculated using PHP.
  2. Discuss the issue of how detailed we should get when constructing a decision tree.

Computing Expected Return

In my blog titled Computing Expected Values I referred you to a video tutorial on how to calculate expected values. In this blog, I will implement that calculation in a PHP script. Implementing the calculation programmatically allows us to see what types of data structures need to be defined and how they are looped over in order to compute expected returns. We need a data structure to represent our actions (i.e., a $nitrogen array), our events (i.e., a $weather array), our outcomes (i.e., a $payoffs matrix), and the expected returns that are computed for each action option (i.e., an $EV array). With these basic elements in place, we can compute our expected values in a straightforward manner, as illustrated in the code below:
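
(The script below is a minimal sketch of this calculation. The weather probabilities and payoff figures are illustrative assumptions, chosen so that the resulting expected values match the output shown below, rather than the exact figures from the earlier decision tree.)

<?php

// Weather events and their probabilities (illustrative values)
$weather = array('poor' => 0.3, 'average' => 0.5, 'good' => 0.2);

// Nitrogen application options (actions)
$nitrogen = array('lo', 'med', 'hi');

// Payoff matrix: payoff for each action under each weather event (illustrative values)
$payoffs = array(
  'lo'  => array('poor' => 5000, 'average' => 7000, 'good' => 9500),
  'med' => array('poor' => 6000, 'average' => 8000, 'good' => 10500),
  'hi'  => array('poor' => 7000, 'average' => 9000, 'good' => 11500),
);

// Compute the expected return for each action by weighting each payoff
// by the probability of the corresponding weather event and summing.
$EV = array();
foreach ($nitrogen as $action) {
  $EV[$action] = 0;
  foreach ($weather as $event => $prob) {
    $EV[$action] += $prob * $payoffs[$action][$event];
  }
}

print_r($EV);

?>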

The output of this script looks like this:

Array
(
    [lo] => 6900
    [med] => 7900
    [hi] => 8900
)

These are the expected returns for low, medium, and high levels of nitrogen application and correspond to the expected returns shown in the decision tree from my last blog.

Levels of Detail

The decision tree we have constructed to represent a nitrogen application decision is vague in many of its details and, as such, would be difficult to use for the purposes of making an actual decision about whether to apply nitrogen or not.

Our biggest omission is to just talk about an "expected return" without talking specifically about whether this is expected revenue, expected profit, or expected utility. If our payoffs are expected revenue amounts then our decision tree is not going to be that useful because it hides the costs involved. For this reason, the expected profit would be a better value to compute as our "payoffs" rather than expected revenues. Theoretically, an even better value to compute would be the expected utility associated with each action option but that is a tricky value to compute because it depends on subjective factors and more complex formulas. For this reason, we can be satisfied if we can at least compute expected profits for each decision option.

Another omission in our decision tree is any discussion of the costs associated with each proposed action. In order to compute such costs we must get detailed about the when, where, and how of applying nitrogen. We also need to estimate the price of nitrogen at the time of application. If we have already purchased our nitrogen then this would simplify our calculations. Other costs include the cost of fuel to apply our nitrogen. We also need to be specific about what crop we are applying our nitrogen to. In order to compute expected profits we would need to estimate the other per-acre costs associated with planting, cultivating, and harvesting the crop so that these can be subtracted from the overall revenue generated.

Our nitrogen application decision is affected by weather, which we have characterized as poor, average, or good. This is also not very precise and would need to be specified in more detail. Weather could specifically mean rainfall amounts in the spring part of the year.

Once we get very specific about costs and about what our variables refer to, our decision tree will provide better guidance on how to act. The visual depiction of a decision as a decision tree helps to organize our research efforts, but it omits much of the research work that will have gone into making the decision tree useful and realistic.


Decision Tree Subgraphs [Decision Trees]
Posted on September 5, 2013 @ 07:55:00 AM by Paul Meagher

I experimented with two Graphviz features that I thought might improve the appearance of my decision trees:

  1. I wanted the connecting lines to be rectilinear rather than curvilinear. Unfortunately, I was not able to achieve this effect when using labelled edges; rectilinear connections only work when labels are applied to nodes, and when labels are applied to edges it may be more difficult to calculate line placement. Rectilinear connections might have looked better, but I'll have to live with curvilinear connections because I prefer labelled edges for the decision trees I'm exploring right now.
  2. I wanted to separately highlight actions, events, and outcome sections of the decision tree. I found a way to do this for the outcome nodes but have not found a way to apply further labelling or highlighting to labelled edges.

Today's blog shows how to use the "subgraph" feature of the dot language to highlight nodes that are related in some way. I used the subgraph feature to better highlight the payoffs for each separate action. The possible payoffs associated with each action option are distinguished by giving each set of payoffs its own color and bounding box.

Notice that the payoffs associated with each action are separately highlighted and that I also add up the payoffs in each set and report the sum as the expected value (EV) for that action. Here is where I get into calculating expected values manually, because the dot language is not a general-purpose programming language. In my next blog I'll be using PHP to compute expected values and supply them to the dot file that is generated.

Here is the dot file that was used to generate a decision tree that uses subgraphs to highlight sections of it.
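
(The original dot file isn't reproduced here; the sketch below is a minimal illustration of the technique. The EV figures match the expected returns used in this series, while the individual probabilities and payoffs are illustrative assumptions.)

digraph nitrogen_decision {
    rankdir=LR;                        // left-to-right layout
    node [shape=point];                // small connector shapes by default

    // Action edges (labels live on the edges rather than the nodes)
    start -> lo  [label="Lo N"];
    start -> med [label="Med N"];
    start -> hi  [label="Hi N"];

    // Subgraph names must begin with "cluster" to get a bounding box;
    // each cluster groups one action's payoffs and reports its EV
    subgraph cluster_lo {
        label="EV = 6900"; color=blue;
        lo_poor [shape=box, label="0.3 x 5000 = 1500"];
        lo_avg  [shape=box, label="0.5 x 7000 = 3500"];
        lo_good [shape=box, label="0.2 x 9500 = 1900"];
    }
    subgraph cluster_med {
        label="EV = 7900"; color=darkgreen;
        med_poor [shape=box, label="0.3 x 6000 = 1800"];
        med_avg  [shape=box, label="0.5 x 8000 = 4000"];
        med_good [shape=box, label="0.2 x 10500 = 2100"];
    }
    subgraph cluster_hi {
        label="EV = 8900"; color=red;
        hi_poor [shape=box, label="0.3 x 7000 = 2100"];
        hi_avg  [shape=box, label="0.5 x 9000 = 4500"];
        hi_good [shape=box, label="0.2 x 11500 = 2300"];
    }

    // Event edges into the payoff nodes
    lo  -> lo_poor  [label="Poor"];   lo  -> lo_avg  [label="Average"];   lo  -> lo_good  [label="Good"];
    med -> med_poor [label="Poor"];   med -> med_avg [label="Average"];   med -> med_good [label="Good"];
    hi  -> hi_poor  [label="Poor"];   hi  -> hi_avg  [label="Average"];   hi  -> hi_good  [label="Good"];
}

Rendering a file like this with, for example, dot -Tpng tree.dot -o tree.png produces three colored bounding boxes, one per action, each labelled with that action's EV.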


Turtle Island Preserve [Nature]
Posted on September 4, 2013 @ 08:43:00 AM by Paul Meagher

I'm watching a couple of documentaries this morning on Eustace Conway. Eustace is one of the featured people on the Mountain Men Reality TV series. I wanted to learn more about what Eustace's property, the Turtle Island Preserve, is all about. The reality TV series does not give one a good sense of how Turtle Island works as a whole. This documentary gives that overview and is quite entertaining, informative, and well done.


Improved Decision Tree Layout [Decision Trees]
Posted on September 3, 2013 @ 08:34:00 AM by Paul Meagher

To date I have not been fully satisfied with how my decision trees have looked. They did not use space efficiently and they were not as easy to read as I would have liked. Today we will make some improvements to a Graphviz recipe for constructing decision trees. These improvements will make for a more space-efficient decision tree that is also easier to read.

The main improvements I have made are:

  • All of the labelling for actions and events appears on the edges instead of the nodes. In previous examples, most labelling was done at the nodes.
  • There is no labelling at all between actions and events, just a small connector shape.

The combination of these two improvements means that 1) it is easier to read the graph, as all the edges are labelled and the flow is oriented in a left-to-right fashion, and 2) the layout is more space efficient, as the nodes connecting actions to events take up a lot of space when they include labelling. Now we have only small connector shapes with no labelling.

I'll illustrate the improved decision tree in the context of a decision about how much nitrogen to apply to a crop per acre, which involves calculating the payoffs you might expect if you get Good, Average, or Poor weather during the growing season. This is what such a decision tree looks like with the improvements mentioned above:

Nitrogen application decision tree

There are a couple of other aspects of this decision tree that are also noteworthy. First, the labels for each possible action (e.g., Nitrogen application amounts) include the cost per acre of applying that amount of Nitrogen. One aspect of constructing a decision tree is computing the cost for each course of action. The second aspect to note is that the terminal nodes on our decision tree are often payoffs that involve multiplying an estimate of revenue by the probability of some event that significantly affects the payoff (e.g., the quality of the weather during the growing season).

I created the visualization for the nitrogen application decision using Graphviz. To do so, I piped the recipe below into the Graphviz program "dot". The recipe illustrates how to add comments to your dot file to make it easy to follow your recipe for rendering a graph shape. The recipe is also organized into logical sections to make it easier to read.
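
(The original recipe isn't reproduced here; the fragment below is a minimal sketch of the layout described, showing one action branch. The per-acre cost, probabilities, and payoff values are illustrative assumptions.)

digraph nitrogen_layout {
    // Layout: orient the flow left-to-right instead of top-to-bottom
    rankdir=LR;
    node [shape=point];                // small unlabelled connector shapes

    // Actions: the edge carries the label, including the (illustrative) cost per acre
    root -> lo [label="Lo N ($20/acre)"];
    // ... analogous edges for the Med N and Hi N actions

    // Events: the edge carries the weather label and its probability
    lo -> lo_poor [label="Poor (0.3)"];
    lo -> lo_avg  [label="Average (0.5)"];
    lo -> lo_good [label="Good (0.2)"];

    // Outcomes: the terminal payoff nodes are the only labelled nodes
    lo_poor [shape=box, label="1500"];
    lo_avg  [shape=box, label="3500"];
    lo_good [shape=box, label="1900"];
}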

Figuring out how to render decision trees with labelled intermediate edges instead of labelled intermediate nodes was a big step in creating a decision tree format that I find more workable. I'm not done yet, however, as I want to explore some other features of Graphviz to add some more tweaks to my decision trees.

