The Year of the Thesis
Mircea, writing down things related to his thesis so he won't forget them. One has many things floating through one's head in the year of the thesis :)
Wednesday, February 11, 2009
A Visual Syntax for Smalltalk
Found an interesting approach to visual programming in Smalltalk.
It looks pretty cool, and the available examples are understandable. The question is the traditional one... does it scale?
Thursday, January 29, 2009
bibliography
bibliographies - on Diomidis Spinellis' homepage
classics: http://www.dmst.aueb.gr/dds/bib/classics.htm
code quality/reading: http://www.dmst.aueb.gr/dds/bib/coderead.htm
kernighan on tools
What do we want from our tools? First and foremost is mechanical advantage: the tool must do some task better than people can, augmenting or replacing our own effort. Grep, which finds patterns of text, is the quintessential example of a good tool: it's dead easy to use, and it searches faster and better than we can. Grep is actually an improvement on many of its successors. I've never figured out how to get Visual Studio or Eclipse to produce a compact list of all the places where a particular string occurs throughout a program. I'm sure experts will be happy to teach me, but that's not much help when the experts are far away or the IDE isn't installed.
That leads to the second criterion for a good tool: it should be available everywhere. It's no help if SuperWhatever for Windows offers some wonderful feature but I'm working on Unix. The other direction is better because Unix command-line tools are readily available everywhere. One of the first things I do on a new Windows machine is install Cygwin so that I can get some work done. The universality of the old faithfuls makes them more useful than more powerful systems that are tied to a specific environment or that are so big and complicated that it just takes too long to get started.
The third criterion for good tools is that they can be used in unexpected ways, the way we use a screwdriver to pry open a paint can and a hammer to close it up again. One of the most compelling advantages of the old Unix collection is that each one does some generic but focused task (searching, sorting, counting, comparing) but can be endlessly combined with others to perform complicated ad hoc operations. The early Unix literature is full of examples of novel shell programs. Of course, the shell itself is a great example of a generic but focused tool: it concentrates on running programs and encapsulating frequent operations in scripts.
http://www.reddit.com/r/programming/comments/7rf9p/sometimes_the_old_ways_are_best_by_brian_kernighan/?sort=old
Thursday, April 3, 2008
Tools
We study the evolution of software systems with the help of tools. By doing this we obtain a better understanding of the domain, and we reuse that knowledge to improve the tools themselves. With better tools we can then learn new facts about the systems we study.
Eventually the users of the tools will also benefit, since the tools are by then much more powerful.
Sunday, March 23, 2008
Top-Down Software Exploration
Another top-down approach is the Software Reflexion Model by Murphy, Notkin, and Sullivan (1995). The Software Reflexion Model is intended to capture and exploit the differences that exist between the source code organization and the designer's mental model of the high-level system organization. An engineer defines a high-level model of the structure of the system and specifies how the model maps to the source. A tool then computes a software reflexion model that shows where the engineer's high-level model agrees with and where it differs from a model of the source. The primary purpose of this technique is to streamline the amount of time it takes for someone unfamiliar with the system to understand its source code structure. (from koschke-thesis)
__
One reason for top-down exploration is supporting supervision. The user needs to
see "his" system evolve.
Friday, March 21, 2008
Related Work on Module Dependencies
http://www.eclipsezone.com/articles/lattix-dsm/ - matrix representation of dependencies, implemented and integrated in Eclipse, by Lattix Inc.
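The idea behind such a dependency structure matrix (DSM) is simple enough to sketch; the module names and dependencies below are invented, not from the Lattix article.

```python
def dsm(modules, deps):
    """Build a dependency structure matrix: entry [i][j] counts the
    dependencies from modules[i] onto modules[j]."""
    index = {m: i for i, m in enumerate(modules)}
    matrix = [[0] * len(modules) for _ in modules]
    for src, dst in deps:
        matrix[index[src]][index[dst]] += 1
    return matrix

modules = ["ui", "core", "util"]
deps = [("ui", "core"), ("ui", "util"), ("core", "util"), ("core", "util")]
# dsm(modules, deps) → [[0, 1, 1], [0, 0, 2], [0, 0, 0]]
```

With the modules ordered top-down, a strictly upper-triangular matrix like this one means clean layering; entries below the diagonal would flag upward (cycle-forming) dependencies.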
Tuesday, March 18, 2008
inherit and reuse...
From OAuth documentation: "While we wanted the best protocol we could design, we also wanted one that people would use and that would be compatible with existing authentication methods, inherit from existing RFCs and reuse web standards wherever possible. In this way, we ended up leaving a lot out of the spec that seemed interesting but ultimately didn’t belong."
Monday, March 17, 2008
[micro-survey] metamodels of hierarchical exploration tools
What are the metamodels that people use when doing hierarchical graph analysis of software systems?
shrimp - the metamodel is a generalized nested graph. In "Manipulating and Documenting Software Structures" (stor-manip) Storey discusses the introduction of the nested graphs formalism by D. Harel (harel-visform).
Nested graphs, in addition to nodes and arcs, contain composite nodes which are used for denoting set inclusion. The containment or nesting feature of composite nodes implicitly communicates the parent-child relationships in a hierarchy. In SHriMP a non-leaf node is open when its children are visible and closed when its children are hidden from view.
[...]
Composite nodes correspond to subsystems in the software. Composite arcs represent a collection of dependencies. Composite nodes may contain other composite nodes and arcs as well as atomic nodes and arcs. This nesting feature of nodes communicates the hierarchical structure of the software (e.g. subsystem or class hierarchies)
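The nested-graph metamodel fits in a few lines; this is an illustrative sketch (not SHriMP's actual code) of composite nodes with an open/closed state.

```python
class Node:
    """A node in a nested graph; a non-empty children list makes it
    a composite node, which can be open (children shown) or closed."""

    def __init__(self, name, children=()):
        self.name = name
        self.children = list(children)
        self.open = True

    def visible(self):
        """Yield the nodes currently shown, honouring open/closed state."""
        yield self
        if self.open:
            for child in self.children:
                yield from child.visible()

root = Node("system", [Node("subsystem", [Node("ClassA"), Node("ClassB")])])
root.children[0].open = False            # close the subsystem composite
[n.name for n in root.visible()]         # → ['system', 'subsystem']
```

Reopening the subsystem makes the two classes visible again, which is exactly the open/closed behaviour the quote describes.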
rigi
rigi's model seems to be pretty cool and general.
The Rigi model is a conceptual model for the representation and organization of the "bricks" and "mortar" of complex software systems. It is best characterized as a special-purpose semantic network (graph) data structure. The semantics of the model are a careful definition of the meaning and usage of the nodes and arcs. The nodes and arcs of the graph represent the components of a software system and their dependencies. For example, a node may represent a subsystem, an interface, a variant, a revision, a specification, a data set, or a picture; an arc may represent a change, compilation, binding, revision, or aggregation dependence.
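What makes the model general is that nodes and arcs each carry a type, so one data structure covers all those cases. A loose sketch of the idea, with type names taken from the quote (the data itself is made up):

```python
from collections import namedtuple

# A typed semantic network: the type field carries the semantics,
# e.g. subsystem/interface for nodes, call/aggregation for arcs.
Node = namedtuple("Node", "name type")
Arc = namedtuple("Arc", "src dst type")

nodes = {Node("parser", "subsystem"), Node("scan", "interface")}
arcs = {Arc("parser", "scan", "aggregation")}

# Queries filter on type rather than on structure:
aggregations = [a for a in arcs if a.type == "aggregation"]
```

Adding a new kind of component or dependency is then a vocabulary change, not a schema change, which is presumably why the model feels so general.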
Bauhaus
really impressive piece of engineering.
Information extracted from the source code is represented in a resource graph (RG) [2], which abstracts from a particular source language by only representing global information such as call, type, and use relations. Examples of relationships range from information that can be directly extracted from the source code (e.g., function calls) to more abstract concepts (e.g., communication between a client and a server). The resulting graph can then be visualized and manipulated in Bauhaus with our extension of the graph editor Rigi.
(from the xfig paper)
A possibly related micro-survey: interaction modes. Do the composite nodes need to be mapped onto existing programming language concepts such as packages, modules, and directories, or can they be anything?
Sunday, March 16, 2008
...
What is the one thing that, in this hour/day/week/month, is the most important thing to do to move forward?
wolfmaier's lessons
(wolf-lessons)
Finished reading Wolfmaier's paper. Pretty cool stuff; the guys talk from experience.
- the effort of analyzing a system is about 10-12 man-days.
- high-level analysis is applied; critical metrics are detected, with expected metric values.
- a good source of heuristics that can be mapped on top of the Softwarenaut view.
- absolutely necessary to involve the architects.
- Sotograph seems to have been useful for them.
- generated code has to be treated differently.
19.07 - 19.28
[micro-survey] Clustered Decomposition Evolution
Did anybody look at the evolution of the clustered decompositions of the system?
This would be an interesting study in itself, if a hard one.
But why? And who cares? And what problems can be solved by doing this?

