Learning to roll with being ‘good enough’ at work and parenting

I wondered if they were talking about me, but it was a few weeks before I had time to sit down and read it without interruption. Several women were interviewed for the piece, and as I read about their experiences, I felt as though I was among my people. When one editor asked me to submit clips of my longform writing, I realized it had been nearly a decade since I’d written anything longer than 1,000 words. There, I said it. And I am trying not to feel bad about it anymore.

Motherhood, especially when you have more than one child, is wonderful, miraculous and everything the women who came to your baby shower said it would be. But it’s also all-encompassing in a way that I never could have imagined before I had kids. There is always something to be done: meals to be made, messes to be cleaned, homework to be supervised, activities to be shuttled to, cuddles to be given (that last one is, admittedly, my favorite). And just when I think I’ve crossed everything off my to-do list and can finally sit down for a moment and maybe pitch an idea that’s been rolling around in my brain, or set up an interview with a source, something else pops up that needs my attention.

When I give my 3-year-old daughter the iPad or turn on the TV so I can get some work done, I think that I should be playing with her, rather than potentially stunting her brain development so I can be creatively fulfilled. When I take an hour to exercise, I worry that I should be writing, that I am squandering my potential, that all of my free time needs to be devoted to my craft. When I’m cleaning or straightening up the house, I feel guilty that I’m not using the time to actively engage with my children. When I just want to watch “This Is Us” for an hour by myself after the kids are in bed, I feel bad that I’m not spending that time watching TV with my husband, whom I haven’t seen all day. On more than one occasion, I have felt paralyzed by what I’d like to accomplish in a day — and how I’m going to get it all done.

This constant feeling of falling short was becoming a major source of unhappiness for me. And that also made me feel terrible because, all things considered, my life is a good one. I continued to wallow in this until one afternoon, as I was scrolling through my Facebook feed (something that also feeds my “never enough” anxiety), I saw a post from a mom asking for advice on how to balance a creative career with parenting. Another mother, with grown children, replied with the idea that there are “seasons to life.” That even if you aren’t doing everything you want to be doing right this second, it’s okay, because someday you will have the time to devote to whatever it is you desire.

Wow. This was an aha moment for me. Yes, I know that my children won’t always need me the way they do now (and that I’ll probably mourn this), but dividing my life into “seasons” allowed me to embrace the fact that the kids are my focus, without feeling guilty or inadequate about what can’t be. It was freeing. I finally let go of the breath I didn’t realize I had been holding.

So as I approach my 40s, I am trying to be more mindful of the metaphorical marathon that I, and so many of my fellow moms, are running. I am trying to be more present for my children mentally, not just physically.

But our conversation soon turned to how hard it is to keep the momentum going on the successful careers we had cultivated before we became parents. “I’m terrified of becoming irrelevant,” she confided in me. Same here.
I scaled back on freelancing when my son was born almost seven years ago, and at the time, I was fine with this. It was financially feasible, I had a newborn to take care of, and I was content to throw my entire being into that work. When my daughter was born in 2014, I again cut back on writing, but this time, when the itch to freelance returned, it was much harder to scratch it. So many colleagues I knew from my days as a magazine editor had fled the uncertainty of the print industry for jobs in content branding or the digital space, and I found myself cold-pitching editors I didn’t know at publications for which I had written years earlier.

IKEA & Teenage Engineering Announce FREKVENS ‘Party’ Collaboration

The line is still early in the design stages: “We are just starting to shape the collection; it’s a work in progress.”

“In FREKVENS, we want to make products that everybody can grasp and handle,” says Teenage Engineering CEO Jesper Kouthoofd. “Even those who are not so tech-savvy should swiftly be able to understand and use the products.”

“Designing FREKVENS, we want to make something that feels like IKEA, and at the same time challenge how we perceive them today,” adds Kouthoofd. “IKEA is furniture, meatballs and soon… Party!”

AI Smartphones Will Soon Be Standard, Thanks to Machine Learning Chip

Almost every major player in the smartphone industry now says that their devices use the power of artificial intelligence (AI), or more specifically, machine learning algorithms. Apple has already designed and built a “neural engine” as part of the iPhone X’s main chipset, to handle the phone’s artificial neural networks for images and speech processing. That might soon change: thanks to a processor dedicated to machine learning for mobile phones and other smart-home devices, AI smartphones could one day be standard.

British chip design firm ARM, the company behind virtually every chip in today’s smartphones, now wants to put the power of AI into every mobile device. Project Trillium would make this process much more efficient. ARM’s built-in AI chip would allow devices to continue running machine learning algorithms even when offline. “We analyze compute workloads, work out which bits are taking the time and the power, and look to see if we can improve on our existing processors,” Jem Davies, ARM’s machine learning group head, told the MIT Technology Review.

ARM doesn’t actually make the chips it designs, so the company has started sharing its plans for this AI chip with hardware partners — like smartphone chipmaker Qualcomm. The MIT Tech Review notes, however, that ARM’s track record for energy-efficient mobile processors could translate to a more widespread adoption of its AI chip. With the advantages machine learning brings to mobile devices, it’s hard not to see this as the future of mobile computing.
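For readers wondering why offline operation is even possible, the following is a minimal, illustrative sketch in plain Python and NumPy (not ARM’s actual toolchain or API, which the article does not describe). It shows what neural-network inference boils down to on a device: a few matrix multiplications and activations over locally stored weights. Nothing in this computation requires a network connection, which is why dedicated low-power silicon for these multiply-accumulate operations is enough to keep machine-learning features working offline.

```python
import numpy as np

# Illustrative only: a tiny two-layer classifier with random weights.
# On a real phone the weights would come from a trained model and the matrix
# math would be offloaded to dedicated ML silicon; plain NumPy stands in here.
rng = np.random.default_rng(0)
W1, b1 = rng.standard_normal((784, 128)) * 0.01, np.zeros(128)
W2, b2 = rng.standard_normal((128, 10)) * 0.01, np.zeros(10)

def infer(image: np.ndarray) -> int:
    """One forward pass: dense layer, ReLU, dense layer, argmax."""
    hidden = np.maximum(image @ W1 + b1, 0.0)
    logits = hidden @ W2 + b2
    return int(np.argmax(logits))

# A fake flattened 28x28 "image": the whole pipeline is local arithmetic,
# with no cloud round trip required.
print("predicted class:", infer(rng.standard_normal(784)))
```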

“Smarticle” Robot Swarms Turn Random Behavior into Collective Intelligence

In a lab at the Georgia Institute of Technology, physicists run experiments with robots that look as though they came from the dollar store. The work with these robots, known as “smarticles,” is part of a broader interest in the feasibility and applications of self-organizing robots. Researchers are learning how to control these systems so that they function in a manner similar to swarms of bees or colonies of ants: Each individual operates in response to the same basic set of instructions. But when the swarm comes together, its members can carry out complex behaviors without any centralized direction. “Our whole perspective is: What’s the simplest computational model that will achieve these complicated tasks?” said Dana Randall, a computer scientist at Georgia Tech and one of the lead researchers on the project. “We’re looking for elegance and simplicity.”

In many of these cases the idea is to mimic emergent phenomena found in nature, like the regimented motion of a decentralized colony of army ants or the unconscious, self-programming assembly of DNA molecules. “We know what we want the collective to do, but in order to program it we need to know what each agent must be doing on the individual level,” said Melvin Gauci, a researcher at Harvard working on swarm robotics. “Going between those two levels is what’s very challenging.”

Beware of Leaders

Daniel Goldman is a physicist at Georgia Tech who is leading the experiments with smarticles (a portmanteau of “smart active particles”). His fundamental scientific interest is in the physics of active granular materials that have the ability to change their own shape. In a slide deck he brings to conferences, he includes a clip from Spider-Man 3 that shows the birth of the supervillain Sandman: Loose grains of sand skitter across the desert and then congeal into the shape of a man. Smarticles are Goldman’s way of testing active granular materials in a lab. “They give us a way to use geometry to control the properties of a material.” They can also be programmed to adjust the rate at which they swing their arms in response to the other smarticles they encounter in their immediate vicinity. These maneuvers could serve as building blocks for more complicated feats, but even the most basic functions, like compression, are hard to engineer when none of the smarticles have any idea where they’re positioned in relation to the overall group. An individual smarticle can’t see, it has limited memory, and the only thing it knows about the other smarticles it’s supposed to coordinate with is what it can learn from bumping into its immediate neighbors. “Imagine one person at a rock concert with his eyes closed,” said Joshua Daymude, a graduate student in computer science at Arizona State University who works on the smarticles project.

One strategy would be to appoint a leader that orchestrates the swarm, but that approach is vulnerable to disruption—if the leader goes down, the whole swarm goes down. Another is to give each robot in the swarm a unique job to perform, but that’s impractical to implement on a large scale. “Individually programming 1,000 robots is basically an impossible task,” said Jeff Dusek, a researcher at Olin College of Engineering and a former member of the Self-Organizing Systems research group at Harvard, where he worked on underwater robot swarms. But “if every agent is following the same set of rules, your code is exactly the same whether you have 10 or 1,000 or 10,000 agents.”

As a computer scientist, Randall thinks about the problem in algorithmic terms: What is the most basic set of instructions individual elements in a swarm can run, based on the meager data they can collect, that will lead inevitably to the complex collective behavior researchers want? An algorithm used to program a swarm has two properties. First, it’s distributed, meaning it runs separately on each individual particle in the system (the way each army ant carries out the same simple set of instructions based on whatever it senses about its local environment). Second, the rules can involve randomness: if an army ant senses, say, five other army ants right around it, maybe there’s a 20 percent chance it moves to the left and an 80 percent chance it moves to the right.

Random Guarantees

In 2015, Goldman and Randall were discussing the possibility of finding rules that would lead Goldman’s smarticles to act coherently as a group. Randall realized that the swarm behaviors Goldman was after were similar to the behavior of idealized particle systems studied in computer science. “I was like, ‘I know exactly what’s going on,’” Randall said.

In the late 1960s, the economist Thomas Schelling wanted to understand how housing segregation takes hold in the absence of any centralized power sorting people into neighborhoods by skin color. In his model, an individual surveyed his immediate neighbors and, if too many of them looked different from him, chose to move. When the person moved, Schelling transported him to a random spot in the housing grid where he repeated the algorithmic process of observing his neighbors and deciding whether to stay or go. Schelling discovered that, according to his rules, residential segregation is virtually guaranteed to take hold, even if individuals prefer to live in diverse neighborhoods. And in Schelling’s model the decisions can be made with an element of randomness—if your neighbors look different from you, maybe there’s a high probability you move, but also some small probability you choose to stay put.

Randall and her co-authors proved that if they weighted the die correctly, they were guaranteed to end up with a compressed swarm (in the same way Schelling could have proved that if he set individuals’ tolerance for diversity at the right level, segregation was unavoidable). The randomness in the algorithm helps particles in a swarm avoid getting stuck in locally compressed states, where lots of isolated subgroups are clustered together but the swarm as a whole isn’t compressed. The randomness ensures that if smarticles end up in small compressed groups, there’s a chance individuals will still decide to move to a new location, keeping the process alive until an overall compressed state is reached. (It takes just a little randomness to nudge particles out of locally compressed states—it takes a lot more to nudge them out of a globally compressed state.)

Into the World

Proving that particles in a theoretical world can run a simple algorithm and achieve specific swarm behaviors is one thing. Actually implementing the algorithm in cheap, faulty, real-life smarticles clanking around in a box is another. “Our theory collaborators are coming up with ways to program these things, but we’re just in the beginning and we can’t yet say these schemes have been transferred directly,” Goldman said. But one day the physicists were observing the smarticles’ chaotic motion when the battery died in one of them. Goldman and his collaborators noticed that the swarm suddenly started moving in the direction of the inactive unit. The work led to the recent development of an algorithm that will always get an idealized swarm to move in a specified direction. The researchers hope to eventually prove theoretically that a basic algorithm, implemented in a distributed way in a large collection of small, cheap robots, is guaranteed to produce a specified swarm behavior. “We’d like to move to a point where it’s not that batteries died and we found a phenomenon,” Daymude said. “We’d like it to be more intentional.”

Reprinted with permission from Quanta Magazine, an editorially independent publication of the Simons Foundation whose mission is to enhance public understanding of science by covering research developments and trends in mathematics and the physical and life sciences.
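The compression result invites a small experiment. The sketch below is a rough, illustrative Python simulation of the idea, not the researchers’ published algorithm (which, among other things, keeps the swarm connected at every step). Every particle follows the same local rule: it occasionally tries to step onto a neighboring empty cell, and the move is accepted with a probability biased by how many neighbors it would gain or lose. The bias parameter LAMBDA plays the role of the weighted die; the leftover randomness is what shakes clusters out of locally compressed dead ends.

```python
import random

# Illustrative sketch only: a Metropolis-style version of the biased local rule
# described above, NOT the exact published algorithm (which also preserves the
# swarm's connectivity at every step).

GRID = 40            # side length of a wrap-around square arena
N_PARTICLES = 200    # number of smarticle-like agents
LAMBDA = 4.0         # bias of the "weighted die"; values near 1 mean almost pure randomness
STEPS = 200_000
MOVES = [(-1, 0), (1, 0), (0, -1), (0, 1)]

rng = random.Random(0)
cells = rng.sample([(x, y) for x in range(GRID) for y in range(GRID)], N_PARTICLES)
occupied = set(cells)

def neighbor_count(cell, occ):
    """How many of the four adjacent cells are occupied."""
    x, y = cell
    return sum(((x + dx) % GRID, (y + dy) % GRID) in occ for dx, dy in MOVES)

def mean_neighbors(occ):
    return sum(neighbor_count(c, occ) for c in occ) / len(occ)

print(f"mean neighbors before: {mean_neighbors(occupied):.2f}")

for _ in range(STEPS):
    # One particle wakes up and considers stepping onto an adjacent empty cell.
    i = rng.randrange(N_PARTICLES)
    x, y = cells[i]
    dx, dy = rng.choice(MOVES)
    target = ((x + dx) % GRID, (y + dy) % GRID)
    if target in occupied:
        continue
    old_n = neighbor_count((x, y), occupied)
    occupied.remove((x, y))              # vacate the old cell while evaluating
    new_n = neighbor_count(target, occupied)
    # Biased coin: gaining neighbors is favored, but losing them still happens
    # occasionally -- the randomness that frees locally compressed clumps.
    if rng.random() < min(1.0, LAMBDA ** (new_n - old_n)):
        occupied.add(target)
        cells[i] = target
    else:
        occupied.add((x, y))             # move rejected, stay put

print(f"mean neighbors after:  {mean_neighbors(occupied):.2f}")
```

With LAMBDA well above 1 the average number of touching neighbors should climb sharply over the run, while a value close to 1 should leave the particles scattered, which mirrors the article’s point that only a correctly weighted die guarantees compression.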

Throwback Thursday: Teaching And Learning Are A Shared Responsibility

Teaching has been a passion of mine since I began riding horses as a young girl. Let me know if any of the subjects I touch on need more explanation.

Learn by making the right correction, not by telling your instructor what she already sees. Most horses do what they are told to do. If you are giving correct aids and you are not getting what you ask for, there are ways to convince the horse to be more generous.

Example: Student B crosses the half diagonal on the left lead. When she reaches the track, she asks for a flying change by putting the left leg back and pressing with it so unfeelingly that the horse actually goes in counter canter with the haunches to the inside, but doesn’t make a flying change. (This is impossible to do if you actually intend to do it, but I saw a student create this very scenario a few weeks ago!) The rider yells over her right shoulder, “I am asking and he is not doing it!” as she rides down the long side in haunches in/counter canter, pressing firmly with the left leg… In reality, this rider was getting exactly what she asked for, so we had to go over the flying change aids again. When asked properly, with a light tap of the spur reinforced by a tap of the whip so that there could be no misunderstanding, the horse did the flying change.

Example: Student C wants to know how to correct her passage-to-piaffe transition. So she does it her way again to show me what she knows rather than trying to learn what I am teaching her. Now, this lady’s horse falls on the forehand in the transition to piaffe because she crams her lower leg back on the barrel and tips forward on her seatbones while pulling the head down in a tight frame. She needs to put her own weight on the back of her seatbones, tap the horse nearer to the girth with her spurs so that he will lift his withers, and allow him to lift his poll by giving with her hand. She stops and says, “I’ve been taught to keep the same contact between piaffe and passage.” So I repeat, “Give on the reins in the piaffe.” She tries again, this time with the spur at the girth, which drops her back on her seatbones and starts to elevate the horse’s withers. When she gives on the reins, his nose comes to the vertical and his poll is at the highest point.

Now, with any one of these scenarios, if the student had just NOT SPOKEN and instead tried to do what she was told, she would have been further along in correcting the mistakes in a shorter amount of time. I like to teach riders from all levels of the sport. I could have taken the time to explain all the tiny details of every correction, but the rider ALWAYS learns more if he or she actually does it first and receives the explanation afterward. If you talk first, the momentum and impulsion you need to achieve your goal are already lost.

Don’t bring the frustrations of your daily life with you to your lesson. Let them go at the entrance to the arena. Your riding instructor is there to help you with your riding, not your marriage, divorce or stress-related problems. Every successful lesson I have ever taught has involved the improvement of a rider’s basic training—be that in a sitting trot lesson without stirrups, or in training the half-halt to improve self-carriage in passage. Don’t tell the teacher how to teach you or what you need to learn.

Rita, I have been both a student and a teacher my whole life. I cannot completely separate one from the other, but I can tell you that as I progress and learn as a dressage rider, this sport continues to fascinate me every day.
Training Tip of the Day: Can you begin a riding lesson (as a student or a teacher) with a clear mind and maintain that state throughout the lesson?

I had taken a hiatus from teaching during the six years of my sponsorship with the JSS Trust because I had the financial support to concentrate solely on training and competing. Now I am on the road again teaching a lot of clinics, and I am very pleased to have new opportunities to share my knowledge with other riders. The concept of sharing responsibility for a good riding lesson—50/50 between the student and the instructor—is strong with me now. Nowadays, the more I learn, the more I want to teach.

Firstly, a good dressage teacher must have an appropriate level of technical understanding for the level of student that rides in front of her/him. If you’re trying to fix problems in the right half-pass without ever having ridden one, you might be in for a bit of a struggle. If a rider comes to me for a lesson because she is struggling with pirouettes, I not only need the technical knowledge of how to ride one, but I also must develop the ability to observe her pirouettes with a skillful eye and offer a solution. 1. Evaluation: What is the basic problem with the pirouette? The rider is sitting too far to the outside and pushing the haunches too far to the inside. The more I teach, the more I want to learn! Begin the pirouette with shoulder-in on the centerline with the rider positioning her upper body more over the inner seatbone. This threefold approach can only be practiced by an instructor who possesses good technical skill AND the ability to evaluate, analyze and correct.

Thirdly, and perhaps most importantly, the best teachers in the world have a generous spirit. They have stepped into the arena not to show off what they know, but to tell a student what she or he must hear in order to learn. I cannot abide an instructor who steps into the arena and tells me everything that is going wrong. This is an ego-based action, and I have no time for it. I KNOW what is going wrong; what I need to hear is how to fix it. A good instructor does not step into the arena to show off, but rather, to HELP. Having said that, what a student needs to hear is different for every individual, and if an instructor (especially a clinician) is going to be successful, he/she must have the ability to figure that out in a timely manner.

Let’s say I have Student A, Student B and Student C in a clinic. The more I teach, the more I appreciate the students who come to me with a centered, quiet mind, ready to listen, absorb and learn. Student A needs to hear: “Listen to the timing of the canter. Hear the rhythm of one-and-two, and one-and-two, and one-and-two.” Student B needs to hear: “Ride counter canter across the half diagonal.” Now, how each of these horses and riders responds to the initial instruction may warrant an adjustment in the approach. And perhaps more importantly, it may warrant an understanding of how you can get the student to respond in a way that helps them learn.

I used to believe that giving a good riding lesson was solely my responsibility. She now professionally trains and shows the horses in my stable. Fifteen years ago, she needed to learn a lot of technical things (like how to count the flying changes). Today, she is still learning technical stuff (like how to speed up the piaffe) but she learns at an incredibly rapid pace now compared to 15 years ago.
What has been consistent over the years is that Casey has always learned from FEEL, and she has always learned AFTER the lesson—when I’m not watching her anymore. I have to find ways to make her feel the things I want to see changed, and then I have to go away to let her learn it while I am not watching. This was not a comfortable process for me in the beginning, but it functions well for us now. I look up from a concentrated ride a few days later and voilà!—there is the picture I want to see.

I realize now, after many decades of teaching, that a student must also take responsibility for the outcome of their lesson. I tend to learn more from Morten the day after he is gone. This just goes to show you that the relationship between each instructor and student will function in various and mysterious ways! Of course, when you have 15 years with the same student, you have the luxury of taking your time to figure out how they learn from you. A clinic situation is time sensitive, and the difference between a good and bad clinician is how quickly one can determine what works for each student. If a student is not willing to do that, he or she will not be long in my company.

Anybody can give 10 riding lessons in a day. Very few people are focused enough to give 10 GOOD riding lessons! Step into the arena to offer guidance, not to show your own knowledge. Correct one thing that is possible to correct in the moment. Do the best job that you can and then detach yourself from the outcome. The rest is up to the student.

Now let’s turn to the responsibility of the student. I am not kidding, Rita! Let the instructor do his/her job without your guidance. Rita, this is going to be a lengthy piece that contains a lot of information.

When she puts her left leg back, the horse swings the haunches out and comes simultaneously against the outside leg and against the hand. The rider tries to correct this by pushing with the left leg but the horse just tosses his head and swings his haunches more into that leg. I say: “You should bend him right in that moment.” Bending the horse to the right, which requires use of the RIGHT rein and leg, breaks up the resistance the horse is offering to the right rein (the actual crux of the problem) and puts the horse in a position to move his haunches away from the left spur when the bend created under the saddle follows through to the hindquarters. I know of course what you are trying to do. I’ve just offered you a solution, but you didn’t hear it because you want to tell me what I already see.

Inside the Chilling World of Artificially Intelligent Drones

According to Russian military spokesmen, the drones were equipped with barometric sensors that allowed them to climb to a preselected altitude, an automatic leveling system for their control surfaces, and precision GPS guidance that would have taken them to their pre-selected targets had they not been intercepted. Chamayou’s continuum collapses when it’s no longer a case of humans killing humans but of a robot and its algorithms initiating the carnage. While it may be years before that kind of “hard” or “complex” AI—the programs that allow a machine to learn and exercise autonomy—are used by terrorists and other non-state actors, it will happen. The primitive AI that guided the drones toward the Russian bases in Syria and that allows AQAP to use off-the-shelf drones to conduct surveillance in Yemen was, just a few years ago, something that was only available to states. The availability of this technology comes at a time when militant groups like ISIS and AQAP are calling for—and supporting—what they call “lone wolf” attacks on targets in the West. While these groups have few qualms about killing those they deem to be infidels, at the level of the individual operative there is almost always doubt, anxiety, fear, and even guilt. A member of a terrorist organization like ISIS could thus launch a “fly and forget” drone on a mission to release a bomb or chemical agent without mistake-inducing fear or anxiety. The operative is entirely removed from the act of killing and that makes it far easier to carry out. Advanced AI paired with drone technology has the potential to overcome even the most effective countermeasures because it dramatically lowers and even eliminates the psychological, physical, and monetary costs of killing. Yuval Noah Harari, author of Sapiens and Homo Deus: A Brief History of Tomorrow, argues that terrorism is a show or spectacle that captures the imagination and provokes states into overreacting. He suggests that “this overreaction to terrorism poses a far greater threat to our security than the terrorists themselves.” The attacks on 9/11 successfully provoked the U.S. government into starting its war on terror, now in its seventeenth year. The incident was an ominous portent of what the world will soon face as governments race to develop smaller, more intelligent, and ultimately wholly autonomous drones. In addition to invading Afghanistan and Iraq, the unintended consequences of which continue to reverberate, the war on terror has driven the rapid development of drones and AI. In many respects, drones are the perfect tools for states. They offer deniability, there are no images of flag-draped coffins, they do not get PTSD, they do not question orders, and they never entertain doubts about what their algorithms tell them to do. Unfortunately for all of us, they’re the perfect tool for terrorists and militants who are less constrained by political agendas, bureaucratic structures, and, to some degree, ethical considerations than states are. In Robert Taber’s timeless book, War of the Flea, he uses the analogy of the dog and its fleas. The state and its military forces are the dog and the guerrilla forces are the fleas attacking it. The dog is of course far bigger and more powerful than the fleas, but it can do little because its enemies are too small and too fast, while it is too big, too slow, and has too much territory to defend. 
Major General Latiff’s call for governments to slow down the development of this technology and assess the consequences of its inevitable leakage into the public sphere should be heeded if we are to avert the kind of outcome foreseen by the AI experts at the Campaign to Stop Killer Robots. While states and the corporations that work for them remain in control of the most advanced military and surveillance technologies, they face the perennial problem of leakage: the inevitable diffusion of technology into the wider world. In November 2017, the campaign released a short dramatic film entitled Slaughterbots that clearly shows where the technology is headed and how it can be used by terrorists and states. There will be more attacks like the one on the Russian base, and as the drones get smaller and more intelligent, they’ll start to look more and more like those slaughterbots.

The 13 crudely made aircraft, which were powered by small gas engines and flew on wings fashioned from laminated Styrofoam, zeroed in on their targets: the vast Russian army base at Khmeimim and the naval base at Tartus on the Syrian coast. The radar signature of the drones was minimal and by taking advantage of a cool night, they were able to fly at low altitudes and avoid detection. It is unclear whether or not the drones were able to communicate with one another and thus behave as a swarm. Russian forces, it is claimed, detected the drones and, through a combination of kinetic and electronic air defense systems, destroyed some of them.

However, what is new and what the attack on the Russian bases in Syria demonstrates is that non-state actors are—just like states—becoming more capable of building and using drones that have minds—albeit primitive ones—of their own. In his prescient and timely book Future War: Preparing for the New Global Battlefield, Major General (Ret.) Robert Latiff argues that we are at a point of divergence where technologies are becoming increasingly complex while our ability and willingness to understand them and their implications is on the decline. He asks, “Will we allow the divergence to continue unabated, or will we attempt to slow it down and take stock of what we as a society are doing?” At this point, there is little evidence that governments or the societies they preside over are undertaking the kind of probing reassessment of technology that Latiff calls for. On the contrary, they’re competing to develop ever-more advanced drones and the AI that will ultimately allow them to think for themselves. In response, governments increasingly need to build and design a host of electronic and kinetic countermeasures to thwart the use of drones by non-state actors.
The threat posed by drones is so difficult to overcome that even the Russians, who are at the forefront of electronic countermeasures, are using trained falcons to guard the Kremlin against the smallest drones. A dangerous cycle has thus begun: governments and the corporations they rely on are driving the development of unmanned technologies and AI. This, in turn, will require ever-more advanced and costly countermeasures to defend the same governments against the technology that has and will leak out. In addition to setting in motion this cycle, the spread of drone technology and AI threatens to overwhelm even the most advanced countermeasures. Few technologies are so capable of lowering or eliminating the psychological, physical, and monetary costs of killing as drones, and it is this subtle yet profound effect that may pose the greatest threat.

“Shotguns, everyone wanted shotguns,” an Iraqi commander said when asked about the weeks in late 2016 when ISIS first began using drones to drop small bombs on Iraqi soldiers. “The quads [quad-copters] are the hardest to hear, see, and hit.” In January 2017, ISIS declared that it had formed the “Unmanned Aircraft of the Mujahedeen,” a unit devoted to drones. While the group has been using drone technology for surveillance and targeting for at least two years, the October attack in Syria marked the debut of its armed drones. “We watched how they got better and better at hitting us,” explained the same Iraqi commander. “First they send a drone in as a spotter, unarmed, that they use to figure out where we’re most vulnerable—an ammo cache, a patio where men are cooking or relaxing. Then they send an armed drone to those coordinates, often at night or in the very early morning when the winds are calm.” ISIS claims to have killed in excess of two hundred Iraqi soldiers with its drones. ISIS, just like the governments that are fighting it, realizes that drones are the future of warfare.

At the same time that groups like ISIS are devoting more and more resources to developing their drone warfare capability, governments and corporations are racing to develop countermeasures. Late in 2016 and in early 2017, soldiers in Iraq and Syria—especially Iraqi soldiers—had few options to defend themselves beyond firing their weapons into the skies. Within weeks of the first attack by ISIS using an armed drone in October 2016, countermeasures, many of which were already under development, were rushed from laboratories to the battlefield. These range from a variety of electronic “drone guns”—which cost tens of thousands of dollars—that jam drones’ ability to receive signals from their operators to shotgun shells that are loaded with a wire net designed to entrap a drone’s propellers and thus bring it to the ground. Groups like ISIS are already developing ways of electronically hardening their drones and adjusting their strategies to make them less susceptible to countermeasures.
“They say the rocks have ears,” explained a Yemeni journalist who studies and writes about al-Qaeda in the Arabian Peninsula (AQAP). “They’re sensors but they look just like rocks.” The sensors, hidden in plasticized containers designed to mimic the rocks of the areas where they are dropped, were likely part of the U.S. government’s effort to combat AQAP in Yemen. The sensors, some of which are solar powered, can lie dormant for years and be programmed to activate by anything from ground vibrations to the sound signature of a specific automobile engine. Once on, they can remain passive, continuing to collect information, or they can signal a drone to come and investigate or neutralize a target.

For years, the U.S. government has deployed and used a range of drones to hunt and kill those who end up on its kill lists as well as so-called targets of opportunity that happen to be in designated “kill boxes.” The exact number of individuals killed by drones in Yemen is unknown, as is the number of civilians killed as a result of these attacks. This is despite the fact that the U.S. government has spent billions of dollars fighting an organization that—at certain points in its history—had fewer than 50 dedicated operatives. The drone campaign has also forced AQAP to develop a range of countermeasures, including trying to co-opt the AI that drives—at least to some degree—target selection.

“They claim that they have let the drones kill some of their rivals,” a Yemen-based analyst explained. “They planted phones in cars that were carrying people they wanted to be eliminated and the drones got them.” While these claims cannot be verified, AQAP knows that data from phones, such as voice signatures and numbers, are vacuumed up by sensors and fed into the algorithms that—at least partly—help analysts decide whom to target. This data is the equivalent of digital blood spoor for the drones that are hunting them. In a recent video, the emir of AQAP, Qasim al-Raymi, decried his operatives’ inability to refrain from using their phones, claiming that most of the attacks on them over the last two years have been due to their use of cell phones. However, given the organization’s expertise with explosives and the increasing availability of military-grade small drones (the UAE and Saudi Arabia are providing these to the forces they support in Yemen) on the black market, it is only a matter of time until armed AQAP drones make their debut in Yemen or elsewhere.

In the besieged Yemeni city of Taiz, where AQAP has been present for at least the last two years, off-the-shelf and modified drones have been used by all sides in the conflict for surveillance. Just as in Iraq and Syria, various groups, including AQAP, are becoming more and more adept at deploying drones with primitive but effective AI.
In Taiz and in other parts of Yemen where Houthi rebels and various factions aligned with Saudi Arabia and the United Arab Emirates are fighting for control, semi-autonomous drones are being used to map enemy positions and monitor the movements of rival forces. These drones are programmed to fly to a preselected set of waypoints that, when desired, allow them to move in a grid pattern, thereby providing a comprehensive view of a particular area. This kind of persistent and low-cost surveillance is critical—just as it is on a far larger and more precise scale for the U.S. government’s drones—for determining patterns of life for an individual or group prior to targeting them. While there are no signs that militant groups like AQAP or ISIS are close to employing more advanced AI that would allow them to use drones to identify and target specific individuals, these groups and others will, in a short period, have access to such technology. Face and gait recognition software and the high-resolution cameras that allow it to function are also widely available and undoubtedly already being used by well-funded non-state actors like Hezbollah.

In January 2015, a small consumer drone crashed on the White House grounds. The two-pound device, which was operated by an unidentified federal government employee, was too small to be detected by the radar installed at the White House. While this drone was reportedly being flown by the employee for recreational purposes, its ability to penetrate the airspace around one of the most secure buildings in the nation’s capital proved how vulnerable these sites are to drone-based attacks and surveillance. Apart from the stunning and multifold implications that this technology has for state security, the use of drones has had a more subtle yet profound effect on those who use them.

The French philosopher Grégoire Chamayou, author of A Theory of the Drone, struggles to situate drones on a continuum of weapons used to hunt and kill other humans, one ordered by the proximity between the hunted and the hunter. The most intimate form of killing is hand-to-hand combat and the most distant is the pilot releasing his payload of bombs at thirty thousand feet or the officer ordering a missile to be launched at some distant target. Yet, just like the launch officer or to a lesser degree the bomber pilot, the drone operator is out of reach. In the case of the U.S. government, he or she is likely thousands of miles away and invulnerable to harm. Yet there is a kind of one-way intimacy between the hunter and the hunted, even though it is pixelated and mediated through screens—enough to unsettle and traumatize many of the soldiers who are charged with operating the drones that hunt and kill in countries like Yemen.

Cooperative and Collaborative Learning: Student Partnership in Online Classrooms

Cooperative and collaborative learning are not new concepts in the field of education – they have been studied for decades and have been used as classroom practices for much longer than that. Although experts in the field might differentiate between the two, I'd suggest that the subtle differences are not all that important. What IS important is that the value proposition of each is similar: to create conditions where students gain interpersonal and cognitive skills necessary for work and life. Cooperative activities are more often utilized in the secondary classroom because the teacher assists in organizing and supervising work, whereas truly collaborative activities require that students own the process of learning more independently. Regardless of terminology, we should all agree that as students progress through education they should be presented with frequent and meaningful opportunities to work with and learn from each other.

There are many benefits to learning in groups – the Eberly Center for Teaching Excellence & Educational Innovation at Carnegie Mellon University outlines many benefits of group exercises on their website. The list includes development and reinforcement of skills that transcend individual and group exercises, such as time management, project planning and task management, effective communication, and sharing or receiving feedback on performance.

Group activities can be especially challenging in an online classroom where students may live in different states or countries. Thanks to the work of education leaders and groups like Education Superhighway, improvements in access to infrastructure, devices, and software have made it easier for students to connect with peers around the world. Still, when these roadblocks present themselves, teachers may be tempted to switch to an individualized version of an activity or move away from group activities in the future. When I was teaching in my online classroom in the early 2000s there were times I was tempted to do so myself. Abandoning collaboration because it isn't easy sends our students the wrong message. They learn that it might be better to go it alone rather than work together, and the opportunity to build those crucial life-skills might be lost.

Partnership for 21st Century Learning published a research brief entitled "What We Know about Collaboration" that contains valuable information and highlights examples of success. Instead of tossing in the towel, here are some ideas to reflect upon, regardless of whether you are a teacher in an online or face-to-face classroom. This list is not exhaustive and certainly doesn't guarantee a successful group exercise, but the items highlighted below are the result of feedback from our students, teachers, and curriculum staff that have worked in cohort-based online classrooms for the past 20 years. They are the benchmarks used by our curriculum team as we discuss online group experiences and are an excellent starting point for any educator interested in enhancing the quality of cooperative or collaborative activities in their classroom. Do students have adequate time to establish closer working relationships and achieve the goals of the activities? Is the work contextualized so that students understand the value of the learning?
For those with an interest in the language of education, the distinction between the two practices is primarily in the ownership of the learning process, although there are some who believe that cooperation and collaboration are essentially the same practice because they "overlap in their typical characteristics (i.e. shared knowledge and authority, socially co-constructed knowledge through peer interactions) and long-term goals which help students learn by working together on substantive issues". When specified, the major differences between cooperation and collaboration are with the role of the instructor in the process and the degree to which the community develops valued and shared vision. Students work individually and together and are accountable to the group for the overall success of the activity.


‘Spend what’s left after saving, not vice versa’


With such a tough upbringing, it is no wonder entrepreneur Teh Li Rong has never been intimidated by hostile workplaces or the risk of going it alone in business. "Family lifestyle was simple and thrifty," the 33-year-old recalled. "Back then, my grandmother and auntie helped out with some of our education expenses."

A: Always spend within your limit, make your spending decisions within your capacity, and face reality. When an investment looks too good to be true, it probably is. Do not invest until there is appropriate investor protection in place and you have done your due diligence. I have seen how much hard work he put in before he eventually succeeded and gave back to society by providing jobs.

That drive to achieve financial independence led her into a career as a derivatives trader after she graduated with a bachelor's degree in business management from Singapore Management University in 2007.

A: I used to be much more aggressive in my investing style in my earlier days, as I was young and single then and had more time to sit in front of the screen analysing, executing and monitoring my trades. Now that I have a family with two children, I adopt a much more conservative, longer-term trading approach. A trader experiences swings every day; if the heart is not strong, the trader will not last. Maintaining a positive attitude and focus, together with equal doses of perseverance, patience, hard work and guts, has led me to success. I save about 50 per cent of my monthly income and channel it into other investments and savings, in asset classes that have lower yields but are more stable.

Ms Teh navigated the intimidating male-dominated environment and became adept at trading instruments like commodity and index futures and options.

A: The spirit of JiojioMe is embedded in the name itself, as "jio" literally means "to ask or invite someone out" in the Hokkien dialect. The app has partnered with more than 600 merchants and establishments to give users exclusive promotions and perks at their fingertips, and we aim to grow to 4,000 partners and 1.5 million users this year. With JiojioMe, you can post an activity and the app will help you find like-minded people to join you. With more offline interactions, this creates a win-win situation for users as well as the merchants and establishments. JiojioMe was officially launched here and is expanding to Malaysia, Thailand, Indonesia, Japan and South Korea.

She also became skilful in developing and executing complex strategies involving systematic and high-frequency trading, arbitrage and directional trading.

A: My portfolio is 35 per cent in properties, 30 per cent in liquid funds, 5 per cent in equities, 10 per cent in businesses, and 20 per cent in managed funds and insurance. I am actively investing in JiojioMe, looking out for other businesses to invest in, and managing funds from my own trading.

But Ms Teh had bigger goals in mind and struck out on her own in 2011 to found Star Financials with a six-figure sum.

A: I do not intend to retire and will continue to earn both active and passive income to sustain my current lifestyle and pay for my kids' education. I plan to work as long as I can and to set up more foundations to help others and contribute to society.

I was sitting on a profit of about $45,000 but I did not realise the profit.

As well as being a trading firm, Star Financials also trains and nurtures people who have no prior experience in the field.
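As a rough illustration of the savings discipline described above, here is a minimal sketch (my own, not from the interview) that applies the 50 per cent savings rate to a hypothetical monthly income and splits the amount using the portfolio weights Ms Teh quotes. The S$10,000 income figure is an assumption for illustration only, and she describes the percentages as her overall asset mix, not necessarily how each month's savings are divided.

```python
# Illustrative only: hypothetical income, with the percentages taken from the interview.
monthly_income = 10_000   # assumed figure, in Singapore dollars
savings_rate = 0.50       # "I save about 50 per cent of my monthly income"

portfolio_weights = {
    "properties": 0.35,
    "liquid funds": 0.30,
    "equities": 0.05,
    "businesses": 0.10,
    "managed funds & insurance": 0.20,
}
assert abs(sum(portfolio_weights.values()) - 1.0) < 1e-9  # weights add up to 100%

monthly_savings = monthly_income * savings_rate            # S$5,000 set aside
for asset, weight in portfolio_weights.items():
    print(f"{asset:<26} S${monthly_savings * weight:,.0f}")
```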
On the first trading day of 2016, there was a huge spike in volatility when the Chinese market experienced a sharp sell-off that quickly sent stocks tumbling globally. The Chinese market fell far enough to trigger its new trading curb rule, and trading was halted once losses reached a certain threshold. A surprise move by the Chinese authorities to suspend the circuit breaker caused further panic in the markets. No matter how volatile markets are, a non-negotiable risk management and money management framework has to be in place, and a trader has to react to minimise losses. I managed to recoup the losses the following month.

The purchase price was US$2.30 apiece and the counter rose to US$62 over a four-year period. I was 25 years old then and did not have the risk appetite to buy more shares during the crisis.

This happened in 2010, when I was trading Japanese index futures using the company's funds. The Greek sovereign debt crisis had begun in late 2009, and the market was very volatile with huge liquidity. I not only profited from the surging price of the underlying asset; as a scalper and momentum trader trading derivatives, I also profited from both selling and buying.

Ms Teh and her 44-year-old husband, a senior director at a local bank, have two children, four-year-old Arissa and two-month-old Allen.
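A quick back-of-envelope check (mine, not the article's) of the share purchase mentioned above, US$2.30 rising to US$62 over four years, expressed as a total multiple and an implied compound annual growth rate:

```python
# Both prices and the holding period are as quoted in the interview.
buy_price = 2.30     # US$ per share
peak_price = 62.00   # US$ per share
years = 4

total_multiple = peak_price / buy_price              # roughly 27x
cagr = (peak_price / buy_price) ** (1 / years) - 1   # roughly 128% a year

print(f"Total multiple: {total_multiple:.1f}x")
print(f"Implied CAGR:   {cagr:.0%} per year")
```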


Berlin First Look: Salma Hayek in ‘The Hummingbird Project’ (Exclusive)


Salma Hayek and Alexander Skarsgard in 'The Hummingbird Project'

Salma Hayek plays the nemesis to Alexander Skarsgard's bald high-frequency trading scammer in this exclusive first look at the actress in The Hummingbird Project, from director Kim Nguyen (War Witch, Two Lovers and a Bear). HanWay is showing first footage from the thriller in Berlin.

Written by Nguyen, The Hummingbird Project sees Jesse Eisenberg and Skarsgard play cousins who inhabit the high-stakes world of high-frequency trading and hatch a multimillion-dollar plan that carries plenty of danger if they fail. The film, first introduced to buyers in Cannes by HanWay, which reps international sales (CAA is overseeing the U.S.), is being produced by Pierre Even (War Witch, C.R.A.Z.Y., Brooklyn) of Item 7 in Montreal and co-produced with Belgian outfit Belga Films, with Brian Kavanaugh-Jones (Loving) and Fred Berger (La La Land) of Automatik as executive producers.


The Chinese overseas shopping spree slowed in 2017, but technology buying remains active


China’s overseas investments declined sharply last year, but technology-related buying remained active as the country pursued a more strategic approach and encouraged companies to purchase advanced technologies in the West. In 2017, Chinese companies’ outbound investments and M&As in the telecommunications, media and technology sectors totalled 395 billion yuan (US$62.28 billion), according to Chinese data compiler ITJUZI.com, based on publicly available information.

While some previously aggressive buyers have retreated, Tencent Holdings and Alibaba Group Holding continue to lead the shopping spree and have spent billions buying overseas companies and their technology, which ranges from artificial intelligence and cryptocurrencies to electric cars and genetic engineering. The investment in technology comes as the Chinese government encourages domestic companies to buy Western technologies when they “go out”, and promises it will facilitate the process.

According to the ITJUZI.com data, outbound telecommunications, media and technology investment mostly went to three areas last year: corporate services, such as big data, cloud computing and AI; fintech, such as blockchain and cryptocurrencies; and health care, such as genetic engineering, biotech and new drug research.

A recent survey by PwC found that the search for advanced technologies meant developed markets in the US and Europe remained the biggest destinations for Chinese buyers. Despite the much-publicised impact of increased scrutiny by US regulators and the Trump administration’s ambivalent attitude towards Chinese investment, the number of deals in the US actually increased slightly to a record 221 in 2017, according to the PwC report.

“Chinese companies can’t invest overseas without government support,” said Liu Lixi, an analyst for Northeast Securities. “I think the selective tightening will persist for some time as the government faces pressure to stabilise the economy and the exchange rate.”
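As a small aside, the two headline figures above imply the conversion rate used in the report; a minimal consistency check of my own (not from the article):

```python
# Figures as quoted above; the implied rate is simply their ratio.
total_yuan = 395e9   # 2017 outbound TMT investment, per ITJUZI.com
total_usd = 62.28e9  # US dollar equivalent quoted in the article

implied_rate = total_yuan / total_usd
print(f"Implied conversion rate: {implied_rate:.2f} yuan per US dollar")  # ~6.34
```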
