By Lauren Carl ‘19
Although the Close was quiet early on Saturday morning, the Gray Library was buzzing with activity from the NCS Hackathon. Three years ago, GWC took inspiration from Technica, a Hackathon the club attended, and decided to host its own Hackathon for NCS. In the tech world, a Hackathon is an event that brings people together to solve real-world problems in a short period of time using technology. By adopting this structure, GWC aimed to teach NCS students about the real-world applications of STEM subjects and hoped to spark an interest in these fields.

This year, the Hackathon offered more subjects than ever, such as programming, robotics, design, cryptography, and virtual reality, among tons of other activities (including activities run by the Wii Club and the NCS Alpha Eagles Team). The day kicked off around 10 a.m. with a guest speaker. Roisin McLoughlin, a senior at Georgetown University, spoke about her experience at VHacks, a Hackathon at the Vatican, where she and her teammates created an app called “Credit/Ability” to help refugees find long-term housing.

After hearing from Roisin, the Hackathon festivities began. Hackers are given three “lesson” times and fill those three hours with different classes to explore new disciplines. Hackers could find themselves decrypting messages in cryptography, playing robot soccer, learning how to code in Java and Scratch, or taking part in one of the other exciting activities. The day also included two “build” challenges, a Rube Goldberg machine and a cardboard fort challenge, to get students on their feet and working in teams, demonstrating that technology doesn’t have to be an individual pursuit. The turnout this year was bigger than ever, consisting mostly of middle school students, with upper school students using their STEM spark to teach others. By the end of the day, Hackers had branched out to new subjects and, hopefully, found their passion in STEM.
Seeing so much excitement and energy in one room was truly inspirational, making everyone more confident in their abilities and the abilities of women in STEM. GWC can’t wait to see what future Hackathons hold in store, and we are confident in the ability of NCS students to do amazing things in STEM.
By Charlotte Rediker ‘22
NCS freshman technology training sessions are designed to teach students about the opportunities and dangers of new technologies. While necessary and useful, they could be improved. Instead of relearning important but sometimes redundant lessons from previous years, a better option would be to include discussions led by older students. Students would be more likely to embrace this new structure enthusiastically, and its content would potentially be more engaging and practical.

This year, at our first training session on Office 365, I heard many of my classmates audibly groan when told that we would learn how to send a sample email and organize files. While teaching these tasks is helpful for those without previous relevant experience, for the bulk of the grade it provided little new information, and many students were not fully attentive, anxious to get on with their busy days. While these sessions definitely helped students new to Office 365, I have a suggestion regarding those who already know the basics. Perhaps NCS could make the full introduction to Office 365 and email mandatory only for students who are unfamiliar with it, while keeping it optional for those who would like a refresher course. This would free up time for the rest to learn about other, newer technologies.

Our next training session, on the appropriate use of social media, was better received than the first, but it could still be improved. Instead of a checklist of tasks, it consisted of open-ended questions prompting discussions among students. We got more out of these student-to-student interactions than we did from hearing the undoubtedly important, but somewhat repetitive, reminders about social media bullying and the need for caution in posting anything. Even so, discussions were dominated by a few students, who largely repeated the same answers many of us had heard before, while much of the class waited patiently to be dismissed.
One new approach to introducing students to technology at the beginning of each school year could be for older NCS students to lead seminars and moderate discussions, perhaps in peer groups, allowing younger students to learn from those who have lived through the experiences and who are presumably familiar with the latest technologies and their benefits and risks. It is indisputable that, at NCS and around the world, technology is an integral part of our lives. We must therefore learn how to use it responsibly. But these seminars might be more effective if students were more excited to attend them, approaching them with the enthusiasm the subject deserves.
By Taliyah Emory-Muhammad ‘19
I’m Taliyah, and I’m a co-lighting designer for this year’s musical production, Titanic. This will be my eleventh show designing, and, as with every show, Titanic poses its own set of challenges. Thankfully, our theater department is one of the best in the country; we have better lighting equipment than many colleges. Additionally, the Close certainly has the best theater faculty any designer could ask for. So I, along with my co-lighting designer Ian Chang, am well equipped to tackle Titanic. Co-designing is a challenge in and of itself. Ian and I have our artistic disagreements, but at the end of the day we make a good team. We think the lighting is going to be pretty cool, and we’ve been working hard to get it done.

Before February started, we did what’s called “focus,” which is basically pointing the lights at the places they’re supposed to light and making sure they aren’t hitting anything they shouldn’t. This is harder than it sounds. All of the lights are in a grid, and we can only access them via the scaffolds. Also, this set has some pretty awkward lighting angles (thanks, Mac), so we’ve often run into the problem of lighting other lights in the grid, which makes ugly shadows on the walls. We’ve also needed to hang a good number of special lighting units. We hung an LED strip to light a platform that stands pretty close to the grid. Lighting areas that close to the grid is a pain, because the actors’ heads are close to the lighting units, so they’re either being blasted with LED light or not being lit enough because other lighting units are in the way. Focus takes about ten hours, and it’s the most laborious part of lighting design.

After most of the focusing is done, we go into cueing, which takes place in the booth at the light board. Before this happens, we usually draft a synopsis of where cues should be, which we then use to write the cues.
Ian is a lot faster at writing cues into the board than I am, so he did most of the initial cueing. That’s okay, though, because my favorite part of lighting design is using light as a medium of art, which gets refined in cue revision and at rehearsals. I’m not going to spoil the show, but the major plot point in Titanic requires impactful lighting, so Ian and I have been working on ways to execute it properly. We have hung multi-colored backlight, and we’re in the process of coding effects to inundate the stage with cool lighting. As this is my last mainstage show at NCS, I’m going to be sad when it’s all over. Titanic is an amazing musical, and I hope everyone comes out to see what we’ve all been working on.
By Isabella Houle ‘19
Although I don’t know the first thing about operating a light board and barely know how I would keep track of moving furniture and props, I have learned organizational skills that rival anything you’d find on a study blog, have moved very quickly under extreme pressure and time constraints, and have fielded many stressed-out questions from people demanding a lot from me. How have I gotten this wide and invaluable experience, you might ask? Costumes Crew.

On Costumes, I work the roughly two-week stint of Pre-Tech, Tech Week, and the show weekends. Pre-Tech for us is all about figuring out the show. With Ms. Liberman’s help, we gauge the show and see what is needed. We learn information and see moments to which we will later become very close, like one actor’s thirty-second quick change, or the vast number of bow ties that we need to keep careful track of. Mostly, it’s about familiarizing ourselves. We learn the characters’ and actors’ names if we don’t know them, we see who wears what, and we pick up tunes and lyrics (sometimes intentionally, sometimes unintentionally).

Then comes Tech Week. For the first few days, it’s sheer anarchy. We figure out what’s needed just by experiencing it. We figure out the best way to do a quick change, and we help people with costumes that don’t fit or need adjustments. I often help everyone get their quick changes together and make sure the actors are set in their costumes, or else I man the headset, the greenroom’s line of communication to the booth and stage managers. My task always depends on the show and what everyone needs -- that’s what I like most about Costumes: it’s unpredictable, and I never know exactly what I’ll be doing to give support during a given rehearsal.

And then come the shows themselves. Hopefully, by then, we’ve gotten the quick changes down pat and know our cues to bring whatever is needed at given points in the show. Now, it’s about anticipating things.
We must be at the ready to zip someone up as they run up for their next song, or be prepared to safety-pin someone’s pants right as they finish changing. Everyone’s hyped up. The enthusiasm is palpable, and it’s a feeling of camaraderie that I rarely get anywhere else. Before I know it, it’s over. Just as I’ve gotten my changes set and learned the music, we’ve closed. But I do know, every time, that we’ve had a successful run and that I’ve contributed to something great. From everything I’ve learned to all the people I’ve met, I’ve loved my time on Costumes and can’t wait for Titanic’s round of shows to come.
By Armon Lotfi ‘20
Since it was introduced in 2012, CRISPR has evolved tremendously, allowing humans to edit the genomes of many earthly organisms, including our own. CRISPR technology allows individuals to edit specific segments of DNA, and it has been used on animals, plants, and humans. The outcomes of these experiments demonstrate that CRISPR technology is still in its developmental stages. Until scientists have foolproof evidence that the CRISPR mechanism works on animals and plants with a high success rate, CRISPR should not be used on the human genome.

Previous experiments on animals show that CRISPR is in its early stages of development. Some critics believe that edited animal DNA could create mutations that are harmful to human health when consumed by humans. Additionally, the breeding of these CRISPR animals could lead to the spread of unwanted mutations. Even though some experiments with animals have been partially successful, they are almost always accompanied by unwanted consequences. These results show that the technology is not close to being perfected; however, it is imperative that researchers continue to experiment on animals, paving the way for the use of CRISPR on the human genome.

In 2018, He Jiankui, a Chinese scientist, used CRISPR to edit CCR5, a gene that produces a protein H.I.V. uses to enter cells. Dr. He wanted to disable the gene in order to make the babies resistant to H.I.V. His experiment is controversial not only because it is not foolproof, but also because it is unknown whether he fully informed the babies’ parents of his exact procedures. Experts predict that his experiment will slow innovative research in the CRISPR world. For now, CRISPR should not be used to edit the human genome until the technology has been (nearly) perfected on animals. We need to wait.
By Will Holland ‘20
Have you ever been really excited to watch a video on a friend’s iPhone, only for that excitement to turn to dismay when the footage was so shaky that its subjects were reduced to an indistinct blur? Well, I have, and far too many times. When my colleague Will Nash and I were preparing to make a movie for our Spanish class that would have a cast of over thirty people and end up being more than an hour long, I worried that our shots of actors in motion would look haphazard and unprofessional. However, this fear of mine became obsolete upon acquiring a DJI Osmo Pocket camera.

For those unfamiliar with the name, DJI is a Chinese technology company specializing in the drone industry. Enthusiasts consider DJI’s drones among the best in the business, while professional filmmakers regard them as essential to any production. Such high esteem was on display in 2017, when DJI won a Technology and Engineering Emmy® award based on the company’s outstanding performance in a variety of high-budget television programs and movies. I had known about DJI to a limited extent back in December, but I had completely failed to notice that their latest product, the Osmo Pocket, is not a drone at all. Rather, it is a handheld camera built with the same stabilizing technology that is a feature of so many of DJI’s drones. When the person holding it is on the move, the image remains completely crisp and steady for the entirety of the shot. If the camera is allowed to focus on a person’s face, it will track that individual automatically. In addition, the periscope-like head of the Osmo Pocket allows the camera to move seamlessly up, down, and sideways within a single take, giving the cameraman more angles for the scene. Thanks to our Director of Cinematography, Cliff McKinney, who brought this device to my attention, the Osmo Pocket was ready to be put to use on our first day of filming over Christmas break.
And a few days later, nobody on our team could have imagined going through the process without it. Even for simple walking shots, an iPhone camera looks jerky and becomes distracting to the viewer. The Osmo Pocket, in contrast, was so steady that it appeared as though it were on an actual film track. Not only that, but when we had to film a chase sequence, the camera adjusted flawlessly to the rapid movements of both our cameraman and our actors.

The camera didn’t just stay on this side of the pond, either. As Christopher Nash ran over the slopes and through the townships of the Cotswolds in England, his older brother, Will, chased after him to capture the footage necessary for the movie’s opening scene. At one point, Chris ran over a hillside that happened to have a flock of sheep grazing down below. Both Chris and Will chased the sheep (in a non-threatening manner) as they made their way through the field, delivering a remarkable moment for our movie. After editing, the footage is of a quality virtually impossible to achieve on a smartphone. The Osmo Pocket truly enabled us to film in ways that I would have thought were out of reach for a group of high school students. Now that the project is finished, I am so glad that such a camera is available to people our age, letting us be as creative as possible when filming. And, on a semi-unrelated note, make sure to see La Orden in Trapier Theater this Sunday at 6:00pm!
By Max Ross ‘20
Yes, as the title suggests, there are bendable phones. However, to appreciate this feat of technology, one must understand the history of the concept. In the 1970s, Xerox PARC theorized a bendable screen in the form of its Gyricon technology, one of the first e-paper products. Gyricon consisted of a thin sheet of two-sided beads that were sensitive to positive and negative charges. Given the thinness of the screen, it could bend easily, but the beads could not produce color. Since the seventies, the problem with creating a bendable phone has been finding a material that can bend, project colors, and be transparent.

Advances stalled until 2008, when Nokia unveiled the “Morph Concept.” The phone was incredible. The “Morph Concept” was meant to be a phone, but its uses were endless: you could bend it around your wrist to make a watch, pull it apart like taffy to make a tablet, and more. Nokia continued to push this bendable technology until 2011, when it released the “Kinetic Concept,” which reverted to a static, rigid form; only a rounded protrusion on the back panel gave the feeling of a bending phone.

Aside from the advances of Nokia, other companies strove to win the race to build the first bendable phone. Companies like Samsung, LG, and even Apple filed patents for bendable phones. Despite promises to release this super-phone in the early 2010s, most companies failed to deliver and still haven’t today. The problem lies in OLED, the light-emitting technology that lights up your screen as you read this. OLED panels rely on indium tin oxide, or ITO, which is crystalline and struggles to handle the force associated with bending a phone. If ITO is stressed too much, the material loses performance.
The immediate solution to this problem is a material called graphene oxide; however, at $500 to $2,000 per kilogram, the substance is exorbitantly priced and would make smartphones even more expensive. Still, some companies have managed to create a bendable phone without it. The catch is that these phones bend on only one hinge and are often buggy. For example, you can buy both the ZTE Axon ($400) and the FlexPai ($1,300) right now. The problem is that they simply don’t work well. Reviews of both phones complain of laggy animations and constant screen-orientation problems, not to mention their problems along the hinge: either the rubber underneath falls apart, or the screen over the hinge blacks out and stops working.

However, these phones are still early iterations of a generation that may be catalyzed by Samsung’s release of its bendable phone at the end of 2019. LG and other phone companies may soon put a tablet in our pockets. After the explosive increase in phone screen sizes over the past five years, the next move is to fold those bigger screens into the phones in our pockets. Even the Apple junkie may have a bendable iPhone in their pocket soon. Only time will tell whether or not this technology is viable.
By Ilyas Talwar ‘20
If you’ve been paying any attention to scientific news over the past three months, you’ve likely heard that back in December NASA’s Voyager 2 probe became the second man-made object to enter interstellar space. While this is certainly a great accomplishment, it is still worth noting that the probe itself has not yet left our solar system; in fact, it will not leave for another 3,300 years. Scientists once considered humanity to be trapped on the Earth; then we landed on the moon. Now we consider ourselves to be trapped in our solar system, and the solution to that seems much harder to find. Moreover, if we were to leave our solar system, we wouldn’t find much, if anything, for a long time.

The nearest star system to ours is called Alpha Centauri. It is a three-star system that contains a binary star system within it. If that sounds confusing, don’t worry; a multi-star system is a strange idea for those of us who live around a single star. The system consists of three stars: Alpha Centauri A, the largest star and the main star in the binary system; Alpha Centauri B, the second largest and the secondary star in the AB binary system; and Alpha Centauri C, more commonly known as Proxima Centauri, the smallest of the three and the closest to Earth. Proxima Centauri has one Earth-like planet orbiting it, Proxima b, and it is about 4.24 light-years away, which isn’t that bad in the grand scheme of things. Thanks to Russian billionaire Yuri Milner, we might just have a way to get there.

Milner’s plan, named “Breakthrough Starshot,” would consist of three main parts: a laser array, a light sail, and a nanocraft. A few spacecraft have already flown using solar sails, which operate exactly as they sound: they use the pressure of sunlight to propel the spacecraft, and, despite taking a while to gather momentum, they can reach incredibly high speeds.
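To get a feel for the scale here, below is a quick back-of-the-envelope calculation in Python using the figures this article quotes: 4.24 light-years to Proxima Centauri and a cruise speed of roughly 20% of the speed of light for a laser-propelled probe. These are the article's estimates, not official mission specifications.

```python
# Back-of-the-envelope Breakthrough Starshot numbers, using the figures
# quoted in this article (assumed values, not official mission specs).

DISTANCE_LY = 4.24   # distance to Proxima Centauri, in light-years
BETA = 0.20          # probe cruise speed as a fraction of the speed of light

# Cruise time: distance in light-years divided by speed as a fraction of c.
cruise_years = DISTANCE_LY / BETA

# Radio signals travel at c, so data takes one light-travel time to return.
signal_delay_years = DISTANCE_LY

# Relativistic Doppler factor for light from straight ahead at speed BETA:
# received frequencies are shifted up by this factor, so images look bluer.
doppler = ((1 + BETA) / (1 - BETA)) ** 0.5

print(f"Cruise time:        ~{cruise_years:.0f} years")
print(f"Signal return time: ~{signal_delay_years:.2f} years")
print(f"Blueshift factor:   ~{doppler:.2f}x")
```

At these assumed values the cruise works out to about 21 years, squarely inside the "20 or 30 years" range quoted for the mission, with another 4.24 years for the data to radio home.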
However, even with solar sails, it would take 54,000 years to reach Proxima b. Starshot therefore intends to use a laser array on Earth to fire at the reflective sails of its probes. In under an hour, these probes could be propelled to nearly 20% of the speed of light, and the trip to Proxima b would take only 20 or 30 years. These spacecraft would be incredibly light: their sails would be only a few hundred atoms thick and span only four meters in width and height, and the craft themselves would be only a little larger and thicker than a postage stamp. The current plan is to have a mother ship carry hundreds of these probes into space and then release them to be powered by the lasers.

Despite the well-thought-out nature of this plan, it does come with many risks. At these speeds, if the probes hit anything, even a piece of debris the size of a tiny pebble, they will be torn to shreds. Furthermore, at such intense speed, any structural weakness could lead to a probe breaking apart in space. However, if all goes according to plan, the probes will reach Proxima b in roughly 30 years, take readings and pictures, and then transmit them back to Earth. The pictures won’t be very pretty; due to the probes’ speed, they will be blueshifted, and, due to the distance, the pictures and readings will not reach Earth until more than four years after they are sent. Still, with the backing of multiple billionaires and scientists, Breakthrough Starshot has become a real possibility.
By Rowan Tsao ‘21
As everyone knows, NCS Winfo was last weekend. That means last-minute dress shopping, packs of people lingering at Open City waiting to film their friends’ “asks,” and, of course, the endless social engineering and reconnaissance on our phones. Technology allows people to coordinate dinners, dresses, and dates, making planning for events like Winfo so much easier.

Technology and social media also add social pressure around dances like Winfo. Take, for instance, the evolution of “the ask.” Instead of a good, old-fashioned, one-on-one awkward conversation, asking someone to a school dance has become a bigger, more elaborate event that we document and post on social media. Do your reconnaissance ahead of time and be certain he or she will say yes! “Asks” include a sign with a cute pun, or buying food and playing music, and, of course, all of your friends have to be there to document the event. This creates added pressure to have a fun, creative, post-worthy “ask.” As someone who likes to keep things low-key, this pressure always frustrates me; however, I still enjoy tagging along to my friends’ “asks,” as they can be fun and personal.

Technology, of course, has become deeply incorporated into our daily lives. Some people may argue that our increased use of smartphones and social media makes us antisocial at events such as Winfo, because we spend our time posing for photographs, then editing and posting them. This may be true, but I think technology also allows us to enjoy the night together, taking pictures and using social media as a place to document our memories.
By Neechi Marupa-Ombima ‘20
The Airbus A380, a massive piece of technological mastery that flies through our skies, has been cut from production by Airbus after less than 15 years in service. If you don’t know the A380, it was the first modern aircraft to offer passengers two full decks of seating. The A380 is also the largest commercial jetliner employed by any airline for transporting passengers, surpassing Boeing’s 747, which has only a partial second deck.

To appreciate one of the greatest technological feats of our generation, one should look at the dimensions of this superjumbo. The A380’s weight is probably the most shocking statistic, coming in at a staggering 632 tons, a gravity-defying number. Additionally, the A380 is nearly 80 feet tall and 240 feet long, making it one of the largest moving man-made structures in the world. The A380 is impressive not only in size but also in technology. The jet is one of the most advanced aircraft of its time and utilizes semi-autonomous operation, meaning that pilot input is aided significantly by computers, allowing for near-perfect accuracy in all stages of flight. For example, with the assistance of autopilot, the A380 could line up its approach onto the runway for landing, or navigate through stormy weather, without the pilot’s hands actively moving the joystick.

Overall, the A380 was great at what it did; it transported hundreds of millions of people and became a passenger favorite. So why did Airbus cancel its production? The reason is probably more complicated than the general public can know, but the main reason has to do with the UAE airline Emirates. Emirates has been the largest operator of the A380 since it first took delivery in 2008. The airline currently operates 108 of the jets, with 54 more pending delivery.
To put that number into perspective, the next-largest operator, Singapore Airlines, has only 24 A380s in its fleet. Therefore, when Emirates decided to reduce its incoming order from 54 new A380s to only 14, Airbus decided to cut the A380 program. Since Airbus relied so heavily on Emirates’ orders to keep the program alive, there was very little hope left once Emirates cut its order, as each aircraft costs over 400 million dollars to make.

For Emirates, the decision to decrease its A380 order is pretty complex, but one clear and large part of it is the direction the aviation industry is moving. For the past couple of decades, big international airlines like Emirates established hubs through which almost any flight operated by the airline would pass. Emirates’ hub is located in Dubai, and Emirates, like other airlines, used its hub to concentrate passenger traffic and operate a hub-and-spoke system in which new routes would all originate from the hub. In recent years, however, with the introduction of more fuel-efficient aircraft like the Boeing 787 and the Airbus A350, the hub-and-spoke system is being phased out at many airlines. Instead, these new aircraft, with much longer operating ranges thanks to more advanced engines, allow more and more airlines to employ a direct-route system. Direct routes appeal to the most lucrative customer base, business travelers, who prefer going straight from point A to point B rather than passing through a hub with a long layover. Additionally, the direct-route system saves airlines not only fuel costs but also promotional costs, as less effort is needed to sell the 300 seats of a 787 than the 500+ seats of an A380.
Ultimately, Emirates’ reasons for cutting its order are valid, as the company was underperforming in profits compared to its competitors, and the decision will send the airline, and the future of commercial aviation, in the right direction.