Syzygy

Tuesday, February 7, 2012

Why I hate programming (part 4 of n)

back to R:

1. Writing some code, seemed a bit slow, looked to stack overflow for help:
"First of all, for anyone who hasn't seen this yet, I strongly recommend reading this article on the r-wiki about code optimization."

status of link: dead
reason: R's wiki page is dead

2. Populating a matrix from a vector. (In R, a matrix is different from an array is different from a vector.)

vec <- 1:9
mat <- as.matrix(vec, nrow = 3, ncol = 3)
# mat is 9x1 (nrow and ncol are silently ignored rather than raising an error)
mat2 <- matrix(vec, nrow = 3, ncol = 3)
# mat2 is 3x3
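The silent failure isn't magic: as.matrix's signature is as.matrix(x, ...), so nrow and ncol get swallowed by ... and never used. The same trap is easy to build in Python (an illustrative sketch, not R's actual code):

```python
def as_matrix(vec, **extra):
    """Builds an N x 1 column matrix, silently discarding any unknown
    keyword arguments -- the same trap as R's as.matrix(x, ...)."""
    return [[v] for v in vec]

m = as_matrix(range(1, 10), nrow=3, ncol=3)  # still 9 x 1; nrow/ncol vanish
```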

3. How big is my vector?

vec <- 1:9
mat2 <- matrix(vec, nrow = 3, ncol = 3)
nrow(vec) # NULL
nrow(mat2) # 3
NROW(vec) # 3
NROW(mat2) # 3


Wednesday, March 2, 2011

bibtex & capitalization

It seems that most of the default bibtex styles convert all non-first letters in titles to lowercase. This is problematic, since of course many papers have proper nouns (e.g. locations, species, named concepts) in their titles! Rather than go through and attach braces "{}" around all letters that need capitalization preserved, it makes more sense to edit the bibtex style file (*.bst) to preserve the title:
If you prefer to edit the bibtex style (.bst) rather than the bibliography (.bib), you can search for occurrences of change.case$ in it. This is the function that capitalizes or title-izes fields that are not people names.

Typically, for the title field, you should find something like title "t" change.case$. Since you want the title unmodified, replace that by just title.
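For reference, here is what that edit looks like in a style derived from plain.bst (the exact surroundings vary between .bst files; treat this as a sketch):

```
% before: titles get down-cased after the first letter
FUNCTION {format.title}
{ title empty$
    { "" }
    { title "t" change.case$ }
  if$
}

% after: the title passes through unmodified
FUNCTION {format.title}
{ title empty$
    { "" }
    { title }
  if$
}
```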

All I needed to do was copy the bibtex style file into the local folder with my tex document, edit it, and re-compile. Voila!

From here.


Tuesday, February 22, 2011

Streamlining the workflow

Yes, TeXShop's keyboard-command typesetting is nice, but I couldn't figure out a way to get it to run through the whole 4-step process (latex/bibtex/latex/latex) for making a file with cited references with one keystroke.

Lyx will do the whole compilation in one command, but its interface doesn't allow me to edit just the latex code, instead forcing me to use some weird WYSIWYM viewing mode.

Luckily, I found some scripts that will integrate MacTeX and TextWrangler.

I still needed a single keystroke full-compile-and-view command though, so I glued the "Full document compilation" and "Show result" scripts together and assigned it a keyboard shortcut (Window->Palettes->Scripts->Set Key...).

I also needed to edit the shell script to use Preview instead of Skim. Maybe this means I should play around with Skim a bit?

Anyway, this now allows me to select a bunch (or all) of references in Papers, export to bibtex, and cite them easily (assuming that the metadata in Papers is correct).


Thursday, January 27, 2011

Why I hate programming (part 3 of n)

1. Matlab's inability to remember the current folder. Sure, you can set the default location, but would it be so hard to remember the current folder when quitting and to automatically open that location when started again? Seems like a no-brainer to have some sort of checkbox for this in the preferences...

2. Matlab's default renderer's inability to handle transparency. Yes, that's right, I want to plot two histograms on top of each other, but the only way it can do transparency is to enable the OpenGL renderer. This would be fine if the OpenGL renderer could do fonts properly, but it can't, and moreover is unable to save images in vector format. (Don't even get me started on trying to draw patterned bars...)

more to come...


Tuesday, October 12, 2010

Why I hate programming (part 2 of n)

The difference between using "=" and "<-" for assignment in R:

It seems that, historically, "=" was not allowed for variable assignment. My understanding is that in modern R, using "=" for assignment is (mostly) equivalent to using "<-", so that:

x = 2  # This line is the same as
x <- 2 # this line.

However, one should be aware that when using "=" to give parameters for functions, assignments do not occur (in the global workspace):

x <- rep(2, times=5)  # "times" does not get the value 5
times                 # gives an error: object 'times' not found
x <- rep(2, times<-5) # "times" IS set to 5 (in the global environment)
times                 # this will have value 5
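The same distinction exists in Python, for comparison: a keyword argument names a parameter without creating a variable, while the walrus operator assigns in the caller's scope, much like R's times<-5 (a Python parallel I'm adding, not R itself):

```python
def rep(x, times):
    # stand-in for R's rep(): repeat x 'times' times
    return [x] * times

a = rep(2, times=5)       # keyword argument: names the parameter, creates no variable
b = rep(2, (times := 5))  # walrus operator: binds 'times' in this scope, like times <- 5
```

After running this, a variable named times exists (equal to 5) only because of the walrus line.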


There are also some weird differences in how "<-" and "=" get interpreted (maybe to do with operator precedence / syntactic sugar?):

x = y = 5   # both "x" and "y" will have value 5
x <- y <- 5 # both "x" and "y" will have value 5
x = y <- 5  # both "x" and "y" will have value 5
x <- y = 5  # gives an error


This is what R's documentation has to say about this nonsense:
"The operators <- and = assign into the environment in which they are evaluated. The operator <- can be used anywhere, whereas the operator = is only allowed at the top level (e.g., in the complete expression typed at the command prompt) or as one of the subexpressions in a braced list of expressions."


Friday, April 30, 2010

Why I won't be buying new desktop parts

Based on my normal computer cycle, I should have upgraded my desktop in late 2008 or early 2009, but haven't. At nearly 5 years old, my desktop is aged, if not obsolete. There are some things it chokes on, mainly any computer game released after 2005 and playing back certain poorly-compressed HD video. The parts I would buy to make it new and fancy again are relatively cheap (in computer terms): ~$600 for a new motherboard, processor, and RAM. I could probably make do with spending half as much if I were planning to replace it sooner (i.e. in 3 years instead of another 5).

Still, my computer is fine as-is (i.e. until something breaks), I don't need a slightly shinier and fancier new monitor, and $600 (or even $300) goes a long way towards buying more Legos. :)


Thursday, April 8, 2010

The return of touchscreen?

Remember back in 2002, when Microsoft had these big plans for tablet PCs, with a whole separate version of XP?

Yeah, you probably don't, but back then I had this wonderful pipe dream for a slick note-taking application. First, it would obviously need to be able to open and write a variety of file formats, including plaintext, richtext, MS Word, PDF, and PPT. Being able to annotate the last two formats would be especially wonderful for marking up papers and lecture slides. Couple this software with handwriting recognition, a computer algebra system, and graphing capabilities, and math nerds would eat it up. Imagine being able to jot down equations, convert them into symbolic entities, and manipulate and solve them in software. As a tool for doing math homework, it could be invaluable.

Separate pieces of software would probably simplify things, but if they were all made by the same company and shared UI structure...


Sunday, December 6, 2009

Q: How hard can copy-paste be?

A: As overly complicated as anything else that comes out of Microsoft's Mac BU. Hey Microsoft, stop playing around with multi-touch in Windows 7 and get some engineers working on very basic aspects of your OS and major software. (You'd think they'd have a better business strategy than developing features for 0.01% of their users while features used by 90% of users don't work properly. But that's what you get with a de-facto monopoly on the industry. I mean, people still use Powerpoint instead of Keynote!)

Well, through trial and error, here is what I have discovered. Let's say you have some data. You plot it in excel and format it all nice and pretty. Then you copy-paste it into Powerpoint for your talk at an upcoming conference. Uh-oh, it doesn't work! Turns out, you need to go back, and save that spreadsheet in .xls format (previous version format). Then you've got to copy and paste-special as a Microsoft Excel Chart Object. Now your format is screwed and your chart looks like shit. No worries, double-click it, tell it to convert, wait a ridiculously long 5 seconds, and now you can edit your chart directly in Powerpoint.

To review:
1) you can't copy and paste from .xlsx to .pptx format directly.
2) when copy and pasting from .xls to .pptx, you need to paste-special (Microsoft Excel Chart Object)
3) the pasted chart needs to be converted before you can format it and for it not to look like crap

I had similar problems copy-pasting from PDF. Solution? Save as PNG and then paste that in.


Wednesday, November 11, 2009

Why I hate MS more and more...

I used to be fairly positive towards Microsoft: sure, OS X is a much nicer experience, indie applications for the Mac are polished to a much higher degree than on Windows, and the free (!) IDE in Xcode is quite good, but Windows has to deal with backwards-compatibility, it is *the* platform for PC gaming, and sometimes Microsoft Research churns out some cool stuff.

Still, I find myself more and more annoyed by Microsoft's UI design sense (or lack thereof). Perhaps I have merely become spoiled by living on a Mac, or more observant from reading Siracusa's rants (exhibit A, exhibit B). Yes, Apple doesn't follow its own UI guidelines - everyone (well, anyone in the "know") admits that - and sometimes this gives us good things, sometimes horrible abominations. But at least Apple has the good sense to make sure at least one engineer brings key interfaces up to new UI standards. (Yes, I'm talking about the control panel UI crap shown here.)

I feel like Microsoft has become a company with no guiding vision - some people work on cool things, some people work on the behemoths known as Windows and Office, but there's no one with the bullwhip making sure things are consistent. Just look at the Office UI - for the de facto office productivity suite, you'd think they wouldn't just up and change the interface on us (but that's what Office 2007 did). And then when they released the next update for OS X, you'd think they'd fix things or add functionality, but instead they removed VBA scripting. And if the Ribbon is such a GREAT UI idea, why isn't it in Office 2008? Yes, I hate the Ribbon, but it's super-annoying to work on something at school, transfer the file home, and suddenly wonder why everything behaves differently. Formatting titles in Excel charts used to be so easy! I do like that the Office 2008 UI behaves more traditionally, but what I don't like is the formatting palette that is clearly an Inspector Tool wannabe. Do the people at the Mac BU not know how to make OS X native apps, do they just not care, or are they hideously understaffed? (Maybe all 3?)

One would think that Office 2008 would run faster than Office 2004 on an Intel Mac (since 2008 is a Universal Binary while 2004 is PPC-only and requires Rosetta). Yet I find that Office 2007, running on emulated Windows XP with half the memory and one core, still runs rings around both "native" Office versions (at least as far as computation in Excel is concerned). Any version of Excel still seems to be faster than the Numbers app in iWork, though...

I already use Keynote for presentations, and Pages / LaTeX for word processing - can someone please make me a good/fast spreadsheet app so that I can put Office out of its misery?


Tuesday, October 20, 2009

The quest for resolution independence

I sent this in an e-mail to a friend who was complaining about the lack of high DPI (dots per inch) consumer-grade LCD desktop displays. (Some models do exist, but are intended for the medical community and are pricey.)

A comparison of DPI for previous/current Apple computers and display products

Laptops:
13.3" (1280 x 800) = 113.49 DPI
15.4" (1440 x 900) = 110.27 DPI
17" (1920 x 1200) = 133.19 DPI

Cinema displays and old iMacs:
23" (1920 x 1200) = 98.44 DPI
24" (1920 x 1200) = 94.34 DPI
30" (2560 x 1600) = 100.63 DPI

New iMacs: (note that these are now 16x9, suitable for watching "widescreen" video without black bars, instead of the 16x10 that is much more common for widescreen computer displays)

21.5" (1920 x 1080) = 102.46 DPI
27" (2560 x 1440) = 108.79 DPI
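All of these numbers are just the pixel diagonal divided by the physical diagonal. A quick sanity check (a throwaway script I'm adding, not from the original email):

```python
import math

def dpi(diagonal_in, width_px, height_px):
    """Pixels along the panel diagonal divided by its length in inches."""
    return math.hypot(width_px, height_px) / diagonal_in

print(round(dpi(13.3, 1280, 800), 2))   # 13.3" MacBook -> 113.49
print(round(dpi(30.0, 2560, 1600), 2))  # 30" Cinema Display -> 100.63
```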

And while Apple has touted a push for resolution independence (along with 64-bit) for a while now, some things still appear to be broken, at least in the first Snow Leopard release. (I haven't installed Snow Leopard yet, so I can't say whether it's been fixed since then.)

http://arstechnica.com/apple/reviews/2009/08/mac-os-x-10-6.ars/21
(scroll down to the Resolution Independence section)

On a further note, I do have minor gripes about 16x10 computer displays, since my current HP display scales widescreen input (via component) up to the full panel, so video games are stretched vertically ever so slightly (+11.1%). I believe this is simply because component is analog and is decoded by an onboard chip (probably the same one that digitizes a VGA signal). Not sure if this is still an issue on the newer LCD displays from HP and Dell that take consumer digital inputs like HDMI (not that I have a PS3 or 360 to test with anyway); I imagine it's still an issue with component video in. On the other hand, 16x10 IS useful for watching 16x9 video, because the black bars leave room for UI popups that don't obscure the video at all.


Sunday, September 6, 2009

clever spam

For the quick comic intro, see here.

In the latest round of human-blogs vs. spambots, I received this "interesting" comment on an oldish post. Since it's rather long, I'm only going to excerpt a particularly funny bit:

"School officials in Democratic-leaning New England say they have received relatively few charm bracelets." (with "charm bracelets" linking to what I assume is a jewelry site, but which probably sells viagra, too.)

Basically, it seems as though it's pulling random sentences from news clippings and then performing some basic parsing to replace certain phrases with its own links to create sometimes-grammatically-correct-but-always-humorous sentences.


Tuesday, September 1, 2009

The Snow Leopard Cometh

Well, Snow Leopard is out, and I've read the review. Here's my brief rundown:

Pros:
- performance improvements (resolving that dreaded kernel_task/CPU/overheating issue somewhat, I hope)
- decreased size (freeing up a few GBs for my paltry 80GB hard drive)
- gamma 2.2 (so I don't have to worry about color differences when my website is viewed on Mac vs. PC)

Cons:
- 64-bit apps (breaking the widescreen hack to Mail and the SIMBL-based color hack to Terminal) [these are not insignificant UI fixes, and they wouldn't be particularly difficult for Apple to implement natively...]

Things I'm excited about that don't affect me at all:
- Xcode 3.2 & Clang (cuz who doesn't like a compiler with a metallic silver dragon logo that doesn't support the programming language you use)
- QuickTime X (cuz hardware-based H.264 acceleration is nice IF your graphics card/chipset is supported)


Wednesday, August 12, 2009

the future of SSD drives

After reading about half of Accelerando and thinking about the nature of technology, I got to wondering what was going on in the field of SSD development. Last I read, the major factors limiting SSD speed were the difference between read and write speeds, and the wear-leveling algorithms needed to stretch drive life as long as possible. In thinking about this problem of optimal SSD use given the physical limitations, I remembered that other innovation in low-level data storage: the separation of the physical drive from the storage methodology (in the form of ZFS). Before I read up on it, I was confused about what ZFS really brought to the table beyond traditional filesystems. (And the answer, for most consumers, is: nothing.) However, it does present some major advantages for those running large servers (hence why it is a feature relegated to the server version of Snow Leopard). So now the question becomes: when will we get a filesystem (or operating system) specifically designed not to collide with the physical limitations of an SSD? Right now, Apple is in love with many small files, which makes incremental backups feasible and easy, but runs counter to effective SSD management.

I realize that at the moment standard magnetic recording still holds 99%+ of the market, but presumably people are going to realize that having TBs of storage space is not useful when media creation has not drastically increased (nor has pipe bandwidth). And if SSD development continues, we should soon see the advantages of dramatically faster / somewhat more expensive storage. After all, to the average consumer, a 1 TB traditional drive has very little marginal value over a 256 GB SSD.


Sunday, July 5, 2009

Now I know what to name my colonies...

About a year ago, I first became enamored with Galactic Civilizations II, a sequel to a game I'd never heard of, Galactic Civilizations (I). Turns out, this game is made by Stardock, who are much better known for their desktop enhancement software. Of course, this interest was spurred by reading PC Gamer's wonderful blog entries on the two expansions to Galciv II, found here and here. Regardless, my itch to conquer space in a turn-based game that looked detailed enough to satisfy my OCD-ness, yet simple enough to satisfy my impatience, went unscratched for over a year. (Partly because I realized how much of a time suck Galciv II was going to be.)

Fast-forward, and suddenly I was in Gamestop, trading in a bunch of games I wasn't going to play again so that I could replace a broken PS2 controller (to finish God of War 2 and feed that Katamari craving) when I saw Galciv II on the shelf. Unfortunately, it was merely the older expansion, but the guy behind the counter was glad to go into the back room and retrieve ... the Ultimate Edition, which includes both expansions and the original for forty bucks! (I guess there's a soundtrack cd too, but the soundtrack is not really designed for standalone listening.)

As of now, I've only completed a few of the missions and two games against the AI, one on a medium-sized map (5x5), and one on a large-sized map (8x8). FYI, there are three more size classes above large, which are Huge (12x12), Gigantic (18x18), and Immense (21x21). Needless to say, maps of that size would require a couple weeks blocked off to play, given that the large-sized map took me about 16 hours to complete...

So far, though, my experience has been excellent. I've only tried AI up to the normal difficulty, which hasn't been too much of a challenge so far. My only complaints are that there are some bugs (1 or 2 crashes to Windows and the weird disappearance of the next-turn button), but the autosave function resolves the former and saving and reloading the game resolves the latter. The lack of documentation is a more serious issue, however, particularly the lack of clarity in game mechanics. For instance, when you research certain technologies, sometimes your stats get a boost, except that it's expressed as a raw integer (e.g. diplomacy +10), yet many of your stats are displayed as percentages. When you build certain social projects on your planets, some of them also yield boosts (e.g. diplomacy +25%), yet it's unclear whether that is an actual +25% to your diplomacy (including previous upgrades), or along the same lines as the +10 from a researched technology. Finally, it is sometimes unclear whether a social project's effects apply just to the planet or to the whole civilization, especially for things that boost morale.

But, it needs to be mentioned that the happiest time I had when playing was on one of the earlier missions, when I noticed the planets named Celes, Locke, and Sabin. Ahh, nostalgia. Luckily, there are 14 different FF6 characters, which will give me plenty of colony names for my first push among the stars.


Wednesday, July 1, 2009

wonderful security lesson from UCSD

From UCSD (summarized by me):

How to use the UCSD encrypted wireless network:

1. download this file from our website
[note: safari wouldn't let me save it as it was, forcing me to change the extension when saving, and then changing the extension back after it was downloaded]

2. double-click the icon

3. if it asks you for your password, enter it in and click ok

[...]

Yes, I was told to download a file, open it, and enter my computer's password. Hmmm, if I didn't know what was actually going on, this would set off all kinds of warning bells. It is so nice that UCSD neglects to explain what it is I am doing and why I should click the "always trust" button when "this root certificate is not trusted", because any potential scammer/botnet creator/hacker/identity thief is sure to explain the mechanics behind why a root certificate is not verified and why entering my password is ok.


Saturday, February 14, 2009

DRM comments

Here is the comment I sent to the FTC for their upcoming workshop on DRM:

I share the opinion of several others that there are aspects of the Digital Millennium Copyright Act (DMCA) that are particularly disruptive for consumers such as myself. Specifically, making the bypassing of Digital Rights Management (DRM) illegal is restrictive towards the needs of certain users. I built my desktop computer with a high-end monitor and surround-sound speakers. In the interest of playing blu-ray high-definition (HD) movies, I began considering the purchase of a blu-ray drive to install in my computer. However, upon further research, I realized that playing back blu-ray movies would not be so simple. Because of High-bandwidth Digital Content Protection (HDCP), a form of DRM required for blu-ray playback, I would need to purchase a new video card that supports HDCP, a new monitor that supports HDCP, a new sound card that supports HDCP, and a new receiver, in addition to software and "upgrading" to Windows Vista. Similarly, if I were to purchase a consumer blu-ray player (such as a PS3 or other device) I would need to purchase a new monitor and receiver to view/hear HD content. Needless to say, I was disheartened. Alternatively, if I downloaded blu-ray movies from the internet, there would be no such restrictions and I would be able to play HD content without any new hardware at all.

I fully support the entertainment industry by purchasing content legally. My personal feeling is that artists, writers, producers, etc. should be rightfully rewarded for their efforts. However, I do not like being forced to purchase hardware because of these restrictions. In effect, I am being punished for trying to play HD content the ONLY legal way. In addition to downloading content (a copyright violation) I could also use software to “rip” HD content to my computer for playback without needing a new video card / monitor / sound card / etc. However, under the DMCA, this manner of bypassing DRM is illegal.

As many have pointed out and will continue to point out, DRM is ineffective: it restricts users such as myself from enjoying the full freedoms of legally purchased content that are enjoyed by those who obtain such content illegally. As noted by security experts, DRM will always be imperfect: there will always be people who will be able to hack/crack/break the encryption and make the content freely available on the internet to download. DRM only creates shackles for legitimate users.

Furthermore, I would like to point out that this issue has been present for some time. DVDs, which have CSS, a form of DRM, require a player that is capable of decrypting the content. However, such players, to my knowledge, were never legally available for users who run Linux operating systems. As such, a program, DeCSS, was created in 1999 that bypasses this form of DRM and is illegal under the DMCA. The Motion Picture Association of America (specifically its former president, Jack Valenti) had promised to create legal DVD player software for Linux that would enable users to view CSS-encrypted DVDs. However, to my knowledge, they have failed to follow through on this: thus, users who wish to play CSS-encrypted DVDs on a Linux computer can only use illegal tools to bypass the DRM.

Industry CANNOT be trusted to follow through on their "promises" to facilitate use of legally purchased content for consumers and end-users. The only option for individuals, then, is to bypass DRM illegally, download content illegally (copyright violation), or forgo such content. The primary purpose of government is to protect the rights of individuals. Thus, the FTC should regulate the ability of industry to abuse DRM: creating additional exceptions to the DMCA for individuals to bypass DRM to enjoy content legally is a vital action to protect individual rights and freedoms.


Saturday, February 7, 2009

LCD Monitors

I was sitting in lab the other day, staring at my monitor, wondering why it appeared so twinkly. After all, it was an Apple Cinema Display (aluminum-frame) 23", very similar to the monitor I have at home (HP L2335 23"), but for some reason, it was annoying the heck out of me.

Curious, I opened my laptop to look at its screen, and the display was not sparkly at all, although it was quite glossy, true to form. So, arriving home that day, I took to looking up LCD monitor information. I was, of course, interested in the possibility of buying a cheap second monitor for myself. Through some digging, I was not surprised to find that those large, cheap LCD screens mostly use TN panel technology, just like laptop panels. And if you've seen laptop panels at a bit of a vertical or horizontal angle, then you've seen the primary bad quality of a TN panel, which is really shitty viewing angle. Beyond that, color reproduction is also quite poor, although response time is the fastest. This wasn't anything new to me, but I was trying to figure out the panel type for Apple's LED cinema display, assuming that I will be able to get my hands on a DVI-to-DisplayPort adapter or some updated Mac Mini that has a DisplayPort output. This website, which keeps a comprehensive list of S-IPS and H-IPS monitors, seems to indicate that Apple's monitors have always been S-IPS or H-IPS. Now confused about the issue of H-IPS vs. S-IPS, I found another site, which actually addressed my original question. The likely culprit is the anti-glare coating on the monitor causing the "twinkling".

Now, onto the amusing part of the whole issue, which is simply that Apple appears to have always used S-IPS and H-IPS panels, at least in their stand-alone monitors and the newer iMacs. Ironically, all the people who laugh at Apple fans for paying the "Apple Tax" on those products are unaware of the display quality, esp. compared to some of the offerings by other manufacturers. (Certainly, the story of Dell pulling a bait-and-switch with one of its monitors has been the source of a minor brouhaha in the past: the first rollout came with S-IPS [for reviews?] and then was switched over to S-PVA [for cheapness?].) All things considered, $800 (educational price) for a 24" LED-backlit H-IPS monitor is a pretty good deal. And for those more economically minded, the HP version without LED-backlighting is ~$600.

Oh, and HDCP, still hate its fucking guts.


Tuesday, January 13, 2009

Why I hate programming

There are generally two approaches to programming. The first approach (which is highly recommended) involves having a mental model of what the code will do, in which case the only bugs that appear will be typing mistakes or algorithmic errors. These two types of errors are fairly easy to distinguish, especially since typing mistakes usually pop up as errors during compiling.

The second approach involves writing code in very short sections at a time, and then fixing any bugs that appear during compiling or running. One of the main problems is that a bug may mean either that a bit of code didn't do what you wanted it to do OR that your algorithm was incorrect, in which case major pieces of code need to be rewritten. Another problem is simply that this method is extremely tedious. It is analogous to solving algebraic equations through trial-and-error. Sure, it works, even if you don't know algebra, but ultimately we end up teaching everyone algebra.

This analogy also gets at the heart of the difficulty with the first approach: in addition to requiring would-be programmers to learn a language's syntax, they must also have a good (i.e. accurate!) mental model of how the language works. Once this model has been internalized, developing code becomes simpler and more straightforward.

The problem I have with certain programming languages is that sometimes they do some things that are non-intuitive to me. Consider the following bit of code in Matlab:

A = ones(3, 5);

which creates a matrix of 1's with 3 rows and 5 columns (yes, it's silly, but bear with me.) Now, consider a slight variation on the code:

B = ones(3);

My initial mental conversation for what this code does might go something like this: "well, in the first example, we gave two arguments and got a two-dimensional matrix, so it makes sense that in this example, with only one argument, we should probably get a one-dimensional matrix. Since the first argument in the first example dictated the number of rows, B should have 3 rows, and be a column vector."

"Hah," Matlab says, "you expected a column vector, but instead, I'm going to give you a square matrix, with 3 rows and 3 columns."

To which I respond, WTF. If I wanted a 3x3 matrix, I could have just as easily called ones(3, 3); BUT, for some reason, probably historical, "people" expect ones(3) to return a square matrix rather than a vector, so that's what Matlab is going to give you. If you actually DID want a vector, you would use ones(3, 1) or ones(1, 3) depending upon your preference for rows vs. columns.

I should note that Matlab does indeed support 3-dimensional matrices:

C = ones(3, 5, 4)

which is equivalent to 4 3x5 matrices stacked on top of each other. When you need two dimensions, you don't need to specify 1 as a third argument, but when you only need one dimension, you DO need to specify 1 as a second argument. Not only is this nonintuitive (to me), but it is now inconsistent as well.
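The "intuitive" semantics I'm arguing for are easy to state precisely. Here is a sketch in Python (a hypothetical ones(), not MATLAB's actual behavior) where one argument always means one dimension:

```python
def ones(*dims):
    """Nested list of 1s with one nesting level per argument.
    ones(3) gives a length-3 vector -- the behavior argued for above;
    MATLAB's ones(3) would instead give a 3x3 square matrix."""
    if len(dims) == 1:
        return [1] * dims[0]
    head, *rest = dims
    return [ones(*rest) for _ in range(head)]

ones(3)     # [1, 1, 1] -- a vector, one argument, one dimension
ones(3, 5)  # 3 rows of 5 ones, matching MATLAB's ones(3, 5)
```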

Now let's take a look at a beautiful R example I just saw today:

x <- 1:10
length(x) # returns 10
length(x) <- 20 # now extends the length of vector x to be 20 elements

First of all, I should mention that R traditionally uses <- as an assignment operator. The introduction mentions that "In most contexts the '=' operator can be used as [an] alternative." It does NOT say that they are equivalent (or why would it say "most contexts"?), but it also fails to mention the cases where <- and = work differently. I am left puzzled.

The problem I have with this example is the third line. In the second line, length(x) returned 10, which we can guess intuitively returns the length of the vector x. However, the notation length(x) indicates that length() is a function, NOT a parameter for an object. The usage of a function is fundamentally different from simple value assignment, such as:

y <- 20

In the latter example, y is a variable, and thus is the target of assignment rather than something to be evaluated. A function, on the other hand, is different, because it is, well, a function. Users can write their own functions, with appropriately specified return values. Note that user-defined functions operate in ONE direction only: arguments are specified, some stuff is done, and sometimes a value is returned. To have that value then be the target of an assignment completely boggles the mind. In addition, it cannot be done with all functions. For instance, the following code gives an error:

x <- 1:10
sum(x) # returns 55
sum(x) <- 20 # error:
Error in sum(x) <- 20 : could not find function "sum<-"

In fact, the error does tell us something about R's internal model: the message about "sum<-" not existing suggests that what is actually going on in the first example is that there are two DIFFERENT functions called length, and that "length(x) <-" is actually syntactic sugar for a call to the other one, named "length<-".

The very nature of syntactic sugar should be to make a language easier to learn/type. However, in this case, it has only made me more confused...
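That "sum<-" error message is the giveaway: length(x) <- 20 gets rewritten into a call to a separate replacement function named "length<-", which exists, while "sum<-" does not. A rough Python analogue of the same idea, using a property setter (my analogy, not how R is actually implemented):

```python
class Vec:
    """A list wrapper whose 'length' can be assigned, mimicking R's
    length(x) <- 20. The property setter plays the role of R's
    replacement function `length<-`."""
    def __init__(self, items):
        self.items = list(items)

    @property
    def length(self):
        return len(self.items)

    @length.setter
    def length(self, n):
        # Pad with None (R pads with NA) or truncate to the new length.
        if n < len(self.items):
            del self.items[n:]
        else:
            self.items += [None] * (n - len(self.items))

v = Vec(range(1, 11))
v.length = 20  # like length(x) <- 20: silently extends the vector
```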


Friday, August 29, 2008

The HD dilemma

I have a nice computer. I have a nice monitor (23" LCD hp 2335, 1920x1200). I have a nice audio setup (Sondigo Inferno w/ optical out to Onkyo 5.1 speaker system). I want to watch blu-ray movies.

Simple, I thought, I'll just buy a blu-ray drive. I wonder how much they cost. Hey look, I can get a Pioneer blu-ray drive that also burns DVDs for $160. Great!

Oh, but I guess I need software to play blu-ray disks. Hey, there's an even cheaper Asus drive that's a retail version with software. Oh wait, that software is crap and only does stereo: that's retarded.

Oh, and my setup isn't HDCP. Wait, WTF? I need a new video card, a new monitor, and it looks like the software to decode and play might not even be XP compatible? That is RIDICULOUS. THANKS A LOT, FCC. WAY TO CAVE IN TO HOLLYWOOD INTERESTS.

Here's the problem: there are three classes of people, only one of whom actually gets screwed over by this HDCP/DRM nonsense:

1: the uploaders/pirates: DRM isn't going to stop them. AnyDVD HD is available for relatively cheap and will do the job.

2: the downloaders: DRM already removed and files uploaded by the pirates, so no HDCP setup is needed, just a sufficiently fast computer and software that isn't restrictive like the commercial blu-ray playing software. XBMC apparently will do the job just fine, even on computers with anemic video cards.

3: honest consumers: willing to buy blu-ray drive, blu-ray disks, even reasonably-priced software to play back the movies. (even after using various free software to play back plain vanilla DVD discs) Not willing to upgrade to Vista, pay $100 to PLAY BACK A MOVIE, buy a new GFX card, and a new monitor.

So, you ask, what's the problem? Just use AnyDVD HD to rip a blu-ray to your hard drive and use XBMC to play it back. Sure, except for a couple of things:

1) I shouldn't have to give up 20+ gigs and ripping time to play back a movie I own when I have hardware that is capable of playing it.

2) Oh yeah, it's illegal thanks to the DMCA. Thanks a lot Congress.

PS: filed under TV as well, cuz of Blu-Ray Firefly. Mmm, naked Nathan Fillion and Morena Baccarin.


Friday, March 21, 2008

Satyagraha

According to TSA Bob, the x-ray of my macbook air is "sensitive security information".

Well, this is my big chance to leak "sensitive security information"; here you go:

[image: the "x-ray" of the MacBook Air]
note: I actually took this using photo booth's x-ray effect.
