[ale] semi [OT] making learning ruby programming fun?

Ron Frazier (ALE) atllinuxenthinfo at techstarship.com
Mon Mar 25 22:38:53 EDT 2013



James Sumners <james.sumners at gmail.com> wrote:

>On Mon, Mar 25, 2013 at 6:48 PM, Ron Frazier (ALE)
><atllinuxenthinfo at techstarship.com> wrote:
>> Actually, in the 21st century, I believe the programmer should NEVER
>have to think about things like memory allocation ... EVER!
>
>There's no chance you're going to see eye-to-eye with pretty much any
>programmer that reads this list. [1] is basically about the dangers of
>glossing over these sorts of topics (i.e. not learning them/using
>them).
>
>Going back to the original topic, there is only one way learning a
>language is going to be fun -- you have to like the language. For me,
>there's no book ever written, nor ever will be, that will make
>learning Ruby fun. I hate the language and its conventions. So, if Ruby
>appeals to you at a basic level, just pick a project and start working
>on it. That's the only real way to learn it.
>
>N.B. when I say "learning a language" I mean the standard library
>and/or framework(s). Learning the syntax and constructs is easy.
>
>[1] --
>http://www.joelonsoftware.com/articles/ThePerilsofJavaSchools.html
>
>
>-- 
>James Sumners
>http://james.roomfullofmirrors.com/
>
>"All governments suffer a recurring problem: Power attracts
>pathological personalities. It is not that power corrupts but that it
>is magnetic to the corruptible. Such people have a tendency to become
>drunk on violence, a condition to which they are quickly addicted."
>
>Missionaria Protectiva, Text QIV (decto)
>CH:D 59

Hi James,

I certainly didn't mean any offense by my statements, and I'm sorry you hate Ruby.  I'll admit my experience is limited, and some of my points of view might be wrong or obsolete.  I guess the only way to really know whether I love or hate Ruby is to work with it for a while.

I appreciate the link you sent and will look at it later.

My main point about memory allocation is this.  I've programmed in a number of high-level languages in my life, mostly in academic settings, including BASIC, Pascal, and Delphi.  I also spent several years programming Clipper for Delta Air Lines.  None of these languages even HAD commands for tinkering with memory, as far as I know.  I just built the program, and the compiler / linker / interpreter figured out what to do with the variables I used.

I have (had) the most knowledge of Clipper, so I'll comment on it.  As a programmer, I just opened the dBASE-style table on the LAN server and went about searching it or adding records to it.  No doubt the database engine (which was likely written in C) had to worry about memory allocation and had to figure out where it would stuff the data from the table I wanted to access.  But I, as an applications programmer, did not.  I just manipulated the file, and it just worked.

What I did have to worry about was variable scope.  The way Clipper worked, variables had the scope of whatever element they were declared in.  There were a couple of variations on the way you could declare them.  But, essentially, if I declared a variable in the main startup function, it would live as long as the program was running.  This was very handy for things that any function might need, such as status indicators, login credentials, system time, etc.  On the other hand, variables declared in a subroutine were visible only to that routine.  If I created something named UserName in a report printing routine, that variable was no longer accessible once the routine exited.

I did create some data warehousing routines with static variables, which were persistent, in case one subroutine needed to save data that other routines would need access to.  The data warehousing module I built had function calls that allowed any routine in the program to add, change, and delete name-value pairs to / from this internal database.  It was kind of like an internal temporary registry.
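Since the thread is about Ruby, here's a minimal Ruby sketch of that kind of internal name-value registry.  The names (Registry, add, change, delete, fetch) are my own invention for illustration, not anything from Clipper; a Hash held inside a module plays the role of the persistent static variables, outliving any one subroutine call, while locals still vanish when their routine returns.

```ruby
# Hypothetical sketch of a Clipper-style "internal temporary registry".
# The module-level Hash persists for the life of the program, much like a
# static variable; locals inside methods do not.
module Registry
  @store = {}  # lives as long as the program runs

  def self.add(name, value)
    @store[name] = value
  end

  def self.change(name, value)
    @store[name] = value if @store.key?(name)
  end

  def self.delete(name)
    @store.delete(name)
  end

  def self.fetch(name)
    @store[name]
  end
end

def report_routine
  user_name = "ron"                           # local: gone once this returns
  Registry.add(:last_report_user, user_name)  # survives via the registry
end
```

After report_routine returns, its local user_name is gone, but Registry.fetch(:last_report_user) still answers "ron" — and at no point did either piece of code touch memory allocation explicitly.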

Now, I'll admit I never used recursive function calls, which could get ugly memory-wise.  But, as I said, the application language never even gave me commands to tinker with memory, so I never did.  I presume the compiler and linker figured out how and when memory would need to be allocated and deallocated based on the variable declarations and scoping.
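For what it's worth, Ruby handles the recursive case the same way: this is my own toy example, not anything from Clipper, but it shows each recursive call getting its own copy of n on the call stack, with each frame reclaimed automatically when its call returns.

```ruby
# Each recursive call gets a fresh stack frame holding its own n; Ruby
# reclaims the frame when the call returns, with no explicit allocate/free.
def factorial(n)
  return 1 if n <= 1
  n * factorial(n - 1)
end

factorial(10)  # => 3628800
```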

For the most part, I don't see why that isn't just the way things are done.

Sincerely,

Ron



--

Sent from my Android Acer A500 tablet with bluetooth keyboard and K-9 Mail.
Please excuse my potential brevity if I'm typing on the touch screen.

(PS - If you email me and don't get a quick response, you might want to
call on the phone.  I get about 300 emails per day from alternate energy
mailing lists and such.  I don't always see new email messages very quickly.)

Ron Frazier
770-205-9422 (O)   Leave a message.
linuxdude AT techstarship.com



