

#Education Articles University

Talking points: MIT Sloan Sports Analytics Conference explores data and how to share it

Posted: 26 Feb 2018 01:30 PM PST

At the Super Bowl this February, the Philadelphia Eagles pulled off the game's signature play: a touchdown pass off a reverse, with tight end Trey Burton throwing to quarterback Nick Foles. That play, which helped upset the New England Patriots, is called the "Philly Special." And from a sports-analytics viewpoint, the play was special, all right — because it came on a fourth-down situation.

For years, data-driven football analysts have been saying that on fourth downs, teams should go for first downs (or touchdowns) more frequently than they typically do, rather than punting or kicking field goals. This season, the Eagles tried to convert fourth downs more than any team since the 2008 Patriots — and it paid off. Indeed, Eagles coach Doug Pederson had an extra assistant coach talking into his headset during games all year, just to inform those kinds of decisions.
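
The analysts' case for going for it is an expected-value argument: keeping the ball is often worth more, on average, than the field position gained by punting. A minimal sketch of that comparison follows; all probabilities and point values are hypothetical round numbers chosen for illustration, not actual NFL data.

```python
# Illustrative expected-value comparison for a fourth-down decision.
# Every number below is a hypothetical placeholder, not real NFL data.

def expected_points(success_prob, points_if_success, points_if_failure):
    """Expected points of a decision with a binary outcome."""
    return success_prob * points_if_success + (1 - success_prob) * points_if_failure

# Toy 4th-and-1 near midfield: converting keeps a promising drive alive,
# failing hands the opponent good field position.
go_for_it = expected_points(0.60, 2.5, -1.5)
# Punting "succeeds" with certainty but concedes possession.
punt = expected_points(1.00, -0.5, 0.0)

print(f"go for it: {go_for_it:+.2f} expected points")
print(f"punt:      {punt:+.2f} expected points")
```

With these toy inputs, going for it comes out ahead; the real models weigh down, distance, field position, and game state, but the structure of the calculation is the same.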

The Eagles' real-time sideline discussions fit nicely with the content of the 12th annual MIT Sloan Sports Analytics Conference (SSAC), held Feb. 23-24; the conference's theme, "Talk Data to Me," and many of its panels underscored the importance of communication in sports analytics.

After all, since pioneering analyst Bill James' popular annual "Baseball Abstract" books began mainstream publication in 1982, some fans have known that a lot of conventional wisdom in sports does not add up. But communicating that to executives, coaches, and players has never been easy. Still, this year SSAC was brimming with speakers, many with championship resumes, attesting to the progress in the field.

"We try to be data-driven and model-driven, but when we get that data, we try to talk about it as a group," said Brandon Taubman, the senior director of baseball operations and analytics at the 2017 World Series-winning Houston Astros, speaking at a panel on baseball data.

"You have to listen," said Nick Caserio, director of player personnel for the Patriots, at a football panel. "It's not about where [an] idea comes from or how it gets to you. If it's a good idea, it's good."

"Our culture is one of information," offered John Chayka, general manager of the Arizona Coyotes of the National Hockey League, at a panel on hockey analytics.

President in the house

First held in 2007, the SSAC was founded by Daryl Morey SM '05, general manager of the Houston Rockets (who currently have the best record in the NBA), and Jessica Gelman, CEO of the Kraft Analytics Group.

This year's edition, held at the Boston Convention and Exhibition Center, was the biggest ever, with 3,500 attendees from 35 countries and 46 U.S. states, representing over 200 universities and roughly 600 companies. The event also featured 37 conference panels, 33 "competitive advantage" talks on research or new products, a research paper competition — and one former U.S. president.

That would be Barack Obama, who spoke on Friday afternoon, in an hour-long on-stage conversation with Morey and Gelman. Obama's remarks, made before a capacity audience, were off the record. 

Obama is well-known as a sports fan, and in keeping with the idea of deploying data whenever possible, Morey and Gelman showed a slide during their Friday morning welcoming remarks underlining his basketball acumen: Obama's NCAA tournament picks, which he publicly unveiled every year during his presidency, have been more accurate than those of Morey and Gelman.

Why players want data: "It justifies the hard work"  

The classic communication problem around sports data has pitted scouts, who evaluate players in person, against analysts, who study large data sets to avoid observational biases. But many team executives say that linking the two groups is now common practice.

"It's incredibly different," said Jerry DiPoto, general manager of baseball's Seattle Mariners, speaking of the evolving relationship between old baseball hands and front-office number crunchers. "Ten years ago they would just sit across the table and throw fruit at each other. … Now it is a bigger and more inclusive group." 

Or, as Taubman put it: "We value good scouts, and we value good analytics."

In some ways, sharing data with players can be trickier, since it involves condensing statistics down to a few useful data points, and not forcing players to rethink their well-honed instincts. But sometimes the best players want the most data — like hockey superstar Sidney Crosby of the two-time defending NHL champion Pittsburgh Penguins.

"Sid right now is at another level in terms of the information he wants to get," said former Penguins coach Dan Bylsma, who steered the team to the 2009 Stanley Cup trophy. Crosby, added Bylsma, would even look at video in between periods of games to analyze how his possessions unfolded.

"But not every player is like that," Bylsma added.

Chris Bosh, a former NBA all-star who appeared on multiple SSAC panels, won two titles with the analytically minded Miami Heat. Even so, Bosh said, he would have preferred being given more defensive data by his coaches. 

"It justifies the hard work," Bosh said.

Percentage basketball, grown organically

One reason the use of analytics in sports may be getting easier is the growing realization that new statistics reinforce traditional values. In basketball, many old-school coaches emphasized winning through defense, rebounding, avoiding turnovers, and having good shot-selection on offense.

Modern basketball analytics puts a premium on those concepts as well, since they all help a team take more high-percentage shots than its opponent, and thus tilt the odds of victory in its favor.
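
The underlying arithmetic is points per shot attempt: a lower-percentage three can be worth more, on average, than a higher-percentage long two. The shooting percentages below are illustrative round figures in the range of typical NBA numbers, not exact league data.

```python
# Points per attempt: why analytics favors threes and shots at the rim.
# The field-goal percentages are illustrative, not exact league averages.

shot_types = {
    "three-pointer":   (0.36, 3),  # (field-goal %, point value)
    "shot at the rim": (0.60, 2),
    "long two":        (0.40, 2),
}

for name, (fg_pct, value) in shot_types.items():
    print(f"{name}: {fg_pct * value:.2f} points per attempt")
```

Even at a much lower make rate, the three-pointer yields more points per attempt than the long two, which is exactly the shot modern offenses have been designed to eliminate.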

What has changed is the contemporary conception of what a good shot is, and how to obtain it. NBA teams now take far more three-point shots than ever, and position players on the floor more deliberately to create space for shooters.

As the star point guard of the highly influential Phoenix Suns for several years starting in the 2004-2005 season, Steve Nash helped create this change as much as anyone. Nash won two MVP awards as the Suns increased the tempo, stretched the court, and created a new template for offense. But did this style develop with numbers in mind? Not exactly, said Nash on Saturday, in his first SSAC appearance.

"A lot of what happened in Phoenix was pretty organic," Nash said, adding that the team's style started to crystallize during pickup games the players held in the fall of 2004. "It wasn't necessarily predetermined."

But it was, organically, a high-percentage style, something Nash's coach at the time, Mike D'Antoni — now coach of Morey's Rockets — quickly realized.

"I think a lot of coaches would have found a way to stop that [style] and find validation in organizing something that didn't need to be organized," Nash added.

So for all the talk at SSAC about getting players and old-school coaches on board with new thinking, Nash's example provides a valuable additional lesson: Everyone needs to be open-minded about the unpredictable evolution of sports. That may be something to talk about at next year's SSAC.

Study reveals why polymer stents failed

Posted: 26 Feb 2018 12:00 PM PST

Many patients with heart disease have a metal stent implanted to keep their coronary artery open and prevent blood clotting that can lead to heart attacks. One drawback to these stents is that long-term use can eventually damage the artery.

Several years ago, in hopes of overcoming that issue, a new type of stent made from biodegradable polymers was introduced. Stent designers hoped that these devices would eventually be absorbed by the blood vessel walls, removing the risk of long-term implantation. At first, these stents appeared to be working well in patients, but after a few years these patients experienced more heart attacks than patients with metal stents, and the polymer stents were taken off the market.

MIT researchers in the Institute for Medical Engineering and Science and the Department of Materials Science and Engineering have now discovered why these stents failed. Their study also reveals why the problems were not uncovered during the development process: The evaluation procedures, which were based on those used for metal stents, were not well-suited to evaluating polymer stents.

"People have been evaluating polymer materials as if they were metals, but metals and polymers don't behave the same way," says Elazer Edelman, the Thomas D. and Virginia W. Cabot Professor of Health Sciences and Technology at MIT. "People were looking at the wrong metrics, they were looking at the wrong timescales, and they didn't have the right tools."

The researchers hope that their work will lead to a new approach to designing and evaluating polymer stents and other types of degradable medical devices.

"When we use polymers to make these devices, we need to start thinking about how the fabrication techniques will affect the microstructure, and how the microstructure will affect the device performance," says lead author Pei-Jiang Wang, a Boston University graduate student who is doing his PhD thesis with Edelman.

Edelman is the senior author of the paper, which appears in the Proceedings of the National Academy of Sciences the week of Feb. 26. Other authors include MIT research scientist Nicola Ferralis, MIT professor of materials science and engineering Jeffrey Grossman, and National University of Ireland Galway professor of engineering Claire Conway.

Microstructural flaws

The degradable stents are made from a polymer called poly-l-lactic acid (pLLA), which is also used in dissolvable sutures. Preclinical testing (studies done in the lab and with animal models) did not reveal any cause for concern. In human patients the stents appeared stable for the first year, but then problems began to arise. After three years, more than 10 percent of patients had experienced a heart attack, some of them fatal, or had needed another medical intervention. That is double the rate seen in patients with metal stents.

After the stents were taken off the market, the team decided to try to figure out if there were any warning signs that could have been detected earlier. To do this, they used Raman spectroscopy to analyze the microstructure of the stents. This technique, which uses light to measure energy shifts in molecular vibrations, offers detailed information about the chemical composition of a material. Ferralis and Grossman modified and optimized the technique for studying stents.

The researchers found that at the microscopic level, polymer stents have a heterogeneous structure that eventually leads to structural collapse. While the outer layers of the stent have a smooth crystalline structure made of highly aligned polymers, the inner core tends to have a less ordered structure. When the stent is inflated, these regions are disrupted, potentially causing early loss of integrity in parts of the structure.

"Because the nonuniform degradation will cause certain locations to degrade faster, it will promote large deformations, potentially causing flow disruption," Wang says.

When the stents become deformed, they can block blood flow, leading to clotting and potentially heart attacks. The researchers believe that the information they gained in this study could help stent designers come up with alternative approaches to fabricating stents, allowing them to possibly eliminate some of the structural irregularities.

A silent problem

Another reason that these problems weren't detected earlier, according to the researchers, is that many preclinical tests were conducted for only about six months. During this time, the polymer devices were beginning to degrade at the microscopic level, but these flaws couldn't be detected with the tools scientists were using to analyze them. Visible deformations did not appear until much later.

"In this period of time, they don't visibly erode. The problem is silent," Edelman says. "But by the end of three years, there's a huge problem."

The researchers believe that their new method for analyzing the device's microstructure could help scientists better evaluate new stents as well as other types of degradable polymer devices.

"This method provides a tool that allows you to look at a metric that very early on tells you something about what will happen much later," Edelman says. "If you know about potential issues in advance, you can have a better idea of where to look in animal models and clinical models for safety issues."

The research was funded by Boston Scientific Corporation and the National Institutes of Health.

Seeing the brain's electrical activity

Posted: 26 Feb 2018 08:00 AM PST

Neurons in the brain communicate via rapid electrical impulses that allow the brain to coordinate behavior, sensation, thoughts, and emotion. Scientists who want to study this electrical activity usually measure these signals with electrodes inserted into the brain, a task that is notoriously difficult and time-consuming.

MIT researchers have now come up with a completely different approach to measuring electrical activity in the brain, which they believe will prove much easier and more informative. They have developed a light-sensitive protein that can be embedded into neuron membranes, where it emits a fluorescent signal that indicates how much voltage a particular cell is experiencing. This could allow scientists to study how neurons behave, millisecond by millisecond, as the brain performs a particular function.

"If you put an electrode in the brain, it's like trying to understand a phone conversation by hearing only one person talk," says Edward Boyden, an associate professor of biological engineering and brain and cognitive sciences at MIT. "Now we can record the neural activity of many cells in a neural circuit and hear them as they talk to each other."

Boyden, who is also a member of MIT's Media Lab, McGovern Institute for Brain Research, and Koch Institute for Integrative Cancer Research, and an HHMI-Simons Faculty Scholar, is the senior author of the study, which appears in the Feb. 26 issue of Nature Chemical Biology. The paper's lead authors are MIT postdocs Kiryl Piatkevich and Erica Jung.

Imaging voltage

For the past two decades, scientists have sought a way to monitor electrical activity in the brain through imaging instead of recording with electrodes. Finding fluorescent molecules that can be used for this kind of imaging has been difficult; not only do the proteins have to be very sensitive to changes in voltage, they must also respond quickly and be resistant to photobleaching (fading that can be caused by exposure to light).

Boyden and his colleagues came up with a new strategy for finding a molecule that would fulfill everything on this wish list: They built a robot that could screen millions of proteins, generated through a process called directed protein evolution, for the traits they wanted.

"You take a gene, then you make millions and millions of mutant genes, and finally you pick the ones that work the best," Boyden says. "That's the way that evolution works in nature, but now we're doing it in the lab with robots so we can pick out the genes with the properties we want."

The researchers made 1.5 million mutated versions of a light-sensitive protein called QuasAr2, which was previously engineered by Adam Cohen's lab at Harvard University. (That work, in turn, was based on the molecule Arch, which the Boyden lab reported in 2010.) The researchers put each of those genes into mammalian cells (one mutant per cell), then grew the cells in lab dishes and used an automated microscope to take pictures of the cells. The robot was able to identify cells with proteins that met the criteria the researchers were looking for, the most important being the protein's location within the cell and its brightness.

The research team then selected five of the best candidates and did another round of mutation, generating 8 million new candidates. The robot picked out the seven best of these, which the researchers then narrowed down to one top performer, which they called Archon1.
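
Conceptually, this screen is an evolutionary search: generate many mutants, score each one, keep the top performers, and mutate again. The sketch below captures that select-and-mutate loop only in outline; the `score` function here is a toy stand-in for the robot's actual measurements of membrane localization and brightness, which this sketch cannot reproduce.

```python
import random

# Conceptual sketch of directed-evolution screening:
# mutate -> score -> select the best -> repeat.
# score() is a toy placeholder for the robot's real measurements
# (membrane localization and fluorescence brightness).

def mutate(gene, rate=0.01):
    """Randomly substitute bases in a gene string."""
    bases = "ACGT"
    return "".join(random.choice(bases) if random.random() < rate else b
                   for b in gene)

def score(gene):
    """Placeholder fitness: fraction of positions matching a fixed target."""
    target = "ACGT" * (len(gene) // 4)
    return sum(a == b for a, b in zip(gene, target)) / len(gene)

def screen(parent, n_mutants=1000, keep=5, rounds=2):
    """Rounds of mutate-score-select; parents compete with their mutants."""
    pool = [parent]
    for _ in range(rounds):
        mutants = [mutate(p) for p in pool
                   for _ in range(n_mutants // len(pool))]
        pool = sorted(pool + mutants, key=score, reverse=True)[:keep]
    return pool[0]

best = screen("A" * 100)  # toy 100-base "gene"
print(f"best candidate score: {score(best):.2f}")
```

The two `rounds` mirror the paper's workflow: a first broad screen, then a second round of mutation seeded only by the best candidates from the first.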

Mapping the brain

A key feature of Archon1 is that once the gene is delivered into a cell, the Archon1 protein embeds itself into the cell membrane, which is the best place to obtain an accurate measurement of a cell's voltage.

Using this protein, the researchers were able to measure electrical activity in mouse brain tissue, as well as in brain cells of zebrafish larvae and the worm Caenorhabditis elegans. The latter two organisms are transparent, so it is easy to expose them to light and image the resulting fluorescence. When the cells are exposed to a certain wavelength of reddish-orange light, the protein sensor emits a longer wavelength of red light, and the brightness of the light corresponds to the voltage of that cell at a given moment in time.

The researchers also showed that Archon1 can be used in conjunction with light-sensitive proteins that are commonly used to silence or stimulate neuron activity — these are known as optogenetic proteins — as long as those proteins respond to colors other than red. In experiments with C. elegans, the researchers demonstrated that they could stimulate one neuron using blue light and then use Archon1 to measure the resulting effect in neurons that receive input from that cell.

Cohen, the Harvard professor who developed the predecessor to Archon1, says the new MIT protein brings scientists closer to the goal of imaging millisecond-timescale electrical activity in live brains.

"Traditionally, it has been excruciatingly labor-intensive to engineer fluorescent voltage indicators, because each mutant had to be cloned individually and then tested through a slow, manual patch-clamp electrophysiology measurement. The Boyden lab developed a very clever high-throughput screening approach to this problem," says Cohen, who was not involved in this study. "Their new reporter looks really great in fish and worms and in brain slices. I'm eager to try it in my lab."

The researchers are now working on using this technology to measure brain activity in mice as they perform various tasks, which Boyden believes should allow them to map neural circuits and discover how they produce specific behaviors.

"We will be able to watch a neural computation happen," he says. "Over the next five years or so we're going to try to solve some small brain circuits completely. Such results might take a step toward understanding what a thought or a feeling actually is."

The research was funded by the HHMI-Simons Faculty Scholars Program, the IET Harvey Prize, the MIT Media Lab, the New York Stem Cell Foundation Robertson Award, the Open Philanthropy Project, John Doerr, the Human Frontier Science Program, the Department of Defense, the National Science Foundation, and the National Institutes of Health, including an NIH Director's Pioneer Award.