I'm starting a weekly feature called "Kelsey on California" where I'll share my musings and diagnoses of life on the left coast.
I'm still not sure how I feel about living in California. Not because I don't like it here, but because it's "California". I guess I've always had a pretty stereotypical bias toward this state and the people in it. As if everything here is basically what you see on "Real Housewives of Orange County". I feel very weird and uncomfortable associating myself with "that". I feel awkward saying "I live in California" out loud. I understand that Northern California is VERY different from Southern California, but still, it's California. I mean, there ARE palm trees here. So right away it's glamorous to me, which makes me feel the need to wear high heels and dangly earrings even while running to the grocery store, or just running for that matter. I have been wearing makeup every day, but I think that's just because I'm bored since I'm unemployed at the moment (hopefully only for another week), and I did buy a new pair of dangly earrings. But, in general, I don't see myself changing a whole lot. At the very least, I guarantee I won't be dyeing my hair or getting a boob job.