Translation Page | USAComment.com

Amazon, Liberty Global Order ‘The Feed’ From ‘Walking Dead’ Writer Channing Powell

Posted: 08 Feb 2018 06:07 AM PST

Amazon and Liberty Global have ordered “The Feed,” a London-set drama about the family of a man who invents a brain implant that allows people to share thoughts and emotions. Things take a turn for the worse when people with “the Feed” technology in their heads start to become murderous, and the family struggles to control the situation.

Amazon Prime Video will run the series, which is based on a novel by Nick Clark Windo, as an original in North America and Latin America. Liberty Global will show it on its cable platforms as part of a move into original programming, which has already yielded “The Rook.” It will play on Liberty Global’s Virgin Media in the U.K.

Channing Powell (“The Walking Dead”) will write the series, and All3Media’s Studio Lambert will produce. The production company was formed by Stephen Lambert, who is known for creating unscripted formats; “The Feed” marks his move into scripted programming. All3Media is owned by Liberty and Discovery, and its distribution arm will sell “The Feed” outside of the Amazon and Liberty Global territories.

“We want large-scale, ambitious shows about contemporary ideas that make a global impact and get people talking,” Liberty Global chief programming officer Bruce Mann said. “And so we were delighted Stephen and Susan brought us such a bold and thought-provoking series.”

“‘The Feed’ has an incredibly provocative story that will challenge and entertain our customers,” added Brad Beale, vice president of Worldwide TV Content Acquisition for Amazon Prime Video. “Channing Powell has an amazing track record captivating audiences globally, and we’re excited to be collaborating with her on this project.”

“We are all aware of our addiction to social media and technology, our fear of what it is doing to our brains and our terror of what would happen if we had to live without it,” said Lambert, Studio Lambert’s chief executive. “These are the core themes of ‘The Feed.'”

Primary Wave Music Promotes Seth Faber and Donna Grecco

Posted: 08 Feb 2018 06:00 AM PST

Primary Wave Music announced today the promotion of Seth Faber to Senior Vice President of Marketing and Donna Grecco to Vice President of Marketing.

A nine-year veteran and partner of Primary Wave, Faber will be responsible for generating “creative and remunerative opportunities” for the company’s song catalogues, according to a press release. He will focus on brand licensing, A&R exploitation and original content creation. Faber has held a variety of duties during his tenure at Primary Wave. As Vice President of Artist Development & Senior Artist Manager, he co-managed the career of Grammy-nominated DJ/producer Audien and signed the singer-songwriter Foy Vance (Ed Sheeran, Miranda Lambert, Rag’n’Bone Man) to a co-publishing agreement. Previously he held A&R and artist development posts at Island and J Records. Faber will continue to be based in New York and will report to Adam Lowenberg and Jeff Straughn, Primary Wave’s Heads of Marketing and Branding, respectively.

“We are excited to have Seth’s bold ingenuity and unbridled passion for music in our arsenal, as we continue to acquire and market many of the greatest songs of all-time,” said Lowenberg of the promotion.

Grecco, who will work alongside Faber and also report to Straughn and Lowenberg, will be responsible for cultivating and expanding Primary Wave’s relationships with the brand community, including identifying brand partnership opportunities for the company’s publishing and management roster.

Prior to joining BSG, Grecco led the marketing and branding team for Vida Brands, where she designed all brand strategies, including creating a celebrity brand ambassador program, producing all photo and video shoots, and producing all trade shows and retail marketing strategies. Before Vida Brands, she owned and operated her own event marketing and tradeshow management company, Design Lab, LTD, for over 10 years.

“We are thrilled to have Donna stepping into this expanded role, where her undeniable marketing expertise will continue to be a priceless asset to our company,” noted Straughn.


The Two-Degree Delusion

Posted: 08 Feb 2018 05:47 AM PST

Global carbon emissions rose again in 2017, disappointing hopes that the previous three years of near zero growth marked an inflection point in the fight against climate change. Advocates of renewable energy had attributed flat emissions to the falling cost of solar panels. Energy efficiency devotees had seen in the pause proof that economic activity had been decoupled from energy consumption. Advocates of fossil fuel divestment had posited that the carbon bubble had finally burst.

Analysts who had attributed the pause to slower economic growth in a number of parts of the world, especially China, were closer to the truth. The underlying fundamentals of the energy economy, after all, remained mostly unchanged: there had been no step change in either the energy efficiency of the global economy or the share of energy production that clean energy accounted for. And sure enough, as growth picked up, emissions started to tick back up as well.

Even during the pause, it was clear that the world wasn’t making much progress toward avoiding significant future climate change. To significantly alter the trajectory of sea level changes or most other climate impacts in this century or the next, emissions would not just have to peak; they would have to fall precipitously. Yet what progress the world has made to cut global emissions has been, under even the most generous assumptions, incremental.

But at the latest climate talks in Bonn last fall, diplomats once again ratified the long-standing international target of limiting warming to two degrees Celsius above preindustrial levels. They did so despite being unable to commit to much beyond what was already agreed at the Paris meeting two years earlier, when negotiators reached a nominal agreement on nonbinding Intended Nationally Determined Contributions, which would result in temperatures surpassing three degrees above preindustrial levels before the end of this century.

Forty years after it was first proposed, the two-degree target continues to maintain a talismanic hold over global efforts to address climate change, despite the fact that virtually all sober analyses conclude that the target is now unobtainable. Some advocates still insist that with sufficient political will, the target can be met. Others recognize that although the goal is practically unachievable, it represents an aspiration that might motivate the world to reduce emissions further and faster than it would otherwise. For still others, the target remains within reach if everyone gets serious about removing carbon from the atmosphere or hacking the atmosphere in order to buy more time.

But it is worth considering the consequences of continuing to pursue a goal that is no longer obtainable. Some significant level of future climate impact is probably unavoidable. Sustaining the fiction that the two-degree target remains viable risks leaving the world ill prepared to mitigate or manage the consequences.

AN ARBITRARY TARGET

My uncle, the Yale University economist William Nordhaus, is widely credited with being the first person to propose that climate policy should strive to limit anthropogenic global warming to two degrees above preindustrial temperatures. He didn’t arrive at that conclusion through any sort of elaborate climate modeling or cost-benefit analysis. Rather, he considered the very limited evidence of long-term climate variance available at that time and concluded that a two-degree increase would take global temperatures outside the range experienced by human societies for the previous several thousand years and probably much longer. The standard was, by his own admission, arbitrary.

In the decades that followed, the international community formalized his target through a series of UN conferences, assessments, and negotiations. Climate researchers, meanwhile, have backfilled the target with science, some of it compelling. It does indeed appear that the earth is already hotter than it has been in the last several hundred thousand years, with temperatures likely to rise substantially more through this century and well beyond.

But limiting global temperatures below two degrees provides no guarantee that the world will avoid catastrophe, nor does exceeding that threshold assure it. No one knows with much precision what the relationship will be between global temperature and the impact of climate change at local and regional levels. Nor do we have a particularly good handle on the capability of human societies to adapt to those impacts.

In reality, most of the climate risks that we understand reasonably well are linear, meaning that lower emissions bring a lower global temperature increase, which in turn brings lower risk. That is the case for impacts such as sea level rise, agricultural yields, rainfall, and drought. Stabilizing atmospheric carbon concentrations at 450 parts per million brings less risk than stabilizing at 500, 500 brings less risk than 550, and so on. The world isn’t saved should we limit atmospheric concentrations to 450 parts per million, nor is it lost should concentrations surpass that threshold.

There are a range of potential nonlinear tipping points that could also bring catastrophic climate impacts. Many climate scientists and advocates argue that the risks associated with triggering these impacts are so great that it is better to take a strict precautionary approach to dramatically cut emissions. But there are enormous uncertainties about where those tipping points actually are. The precautionary principle holds equally well at one degree of warming, a threshold that we have already surpassed; one and a half degrees, which we will soon surpass; or, for that matter, three degrees.

Such calculations are further complicated by the substantial lag between when we emit carbon and when we experience the climate impacts of doing so: because of the time lag, and because of the substantial amount of carbon already emitted (atmospheric concentrations of carbon today stand at 407 parts per million, versus 275 prior to the start of the Industrial Revolution), even an extreme precautionary approach that ended all greenhouse gas emissions immediately would not much affect the trajectory of global temperatures or climate impacts until late in this century at the earliest.

Projections of sea level rise, for instance, don’t really diverge in high-emissions versus low-emissions scenarios until late in this century, and even then not by very much. It is not until modelers project into the twenty-second century that large differences begin to emerge. The same is true of most other climate impacts, at least as far as we understand them.

Many advocates for climate action suggest that we are already experiencing the impacts of anthropogenic climate change in the form of more extreme weather and natural disasters. Insofar as this is true (and the effect of climate change on present-day weather disasters is highly contested), there is not much we can do to mitigate it in the coming decades.

THE URGENCY TO ADAPT

Over the last two decades, discussions of climate risk have been strongly influenced by concerns about moral hazard. The suggestion that human societies might successfully adapt to climate change, the argument goes, risks undermining commitments to cut emissions sufficiently to avoid those risks.

But moral hazard runs the other way as well. On a planet that is almost certainly going to be much hotter even if the world cuts emissions rapidly, the continuing insistence that human societies might cut emissions rapidly enough to avoid dangerous climate change risks undermining the urgency to adapt.

Adaptation brings difficult tradeoffs that many climate advocates would prefer to ignore. Individual and societal wealth, infrastructure, mobility, and economic integration are the primary determinants of how vulnerable human societies are to climate disasters. A natural disaster of the same magnitude will generally bring dramatically greater suffering in a poor country than in a rich one. For this reason, poor nations will bear the brunt of climate impacts. But by the same token, the faster those nations develop, the more resilient they will be to climate change. Development in most parts of the world, however, still entails burning more fossil fuels; in most cases, a lot more.

Most climate advocates have accepted that some form of adaptation will be a necessity for human societies over the course of this century. But many refuse to acknowledge that much of that adjustment will need to be powered by fossil fuels. Hard infrastructure (modern housing, transportation networks, and the like) is what makes people resilient to climate and other natural disasters. That sort of infrastructure requires steel and concrete. And there are presently few economically viable ways to produce steel or concrete without fossil fuels.

The two-degree threshold, and the various carbon budgets and emissions reduction targets that accompany it, has provided the justification for prohibitions at the World Bank and other international development institutions on finance for fossil fuel development. Given how much climate change is likely already built into our future owing to past emissions and how long it takes for emissions reductions to mitigate climate impacts, those sorts of policies will almost certainly increase exposure to climate hazards for many people in developing economies.

DEPLOYMENT DELUSIONS

Continued devotion to the two-degree target has also undermined carbon-cutting efforts. In theory, cutting emissions deeply enough by midcentury to limit warming to two degrees would require deploying zero-carbon energy technologies today at a historically unprecedented scale. That would seem to take important drivers of incremental decarbonization, such as the transition from coal to gas in the United States and many other parts of the world, off the table. Burning natural gas produces about half the carbon per unit of energy that burning coal does. But it can’t decarbonize the power sector fast enough to hit the two-degree target by 2050.

For this reason, most climate advocates are at best indifferent to natural gas and are more often opposed, even though the switch from coal to natural gas has been the largest source of emissions reductions in the United States for over a decade, as it was in the United Kingdom in the early 1990s.

The two-degree target has also hobbled support for developing better clean energy technologies. Because next-generation technologies such as advanced nuclear reactors, advanced geothermal, and carbon capture capabilities won’t be ready for large-scale commercialization for at least another decade or two, they will arrive too late to contribute much to two-degree stabilization scenarios. In turn, many prominent climate advocates have long argued that the only climate action worthy of the name entails deploying zero-carbon technologies that are commercially available today.

Yet there is little reason to think that existing zero-carbon technologies are up to the job. To be sure, some models do claim that current renewable energy technologies are capable of powering the electrical grid and much beyond. But strong renewables growth in various parts of the world appears to follow a classic S-curve, with market share on electrical grids stalling at around 20 percent or less of total generation after a period of strong initial adoption, because the value of intermittent sources of energy such as wind and solar declines precipitously as their share of electricity production rises.

For a period of time, in the 1970s and 1980s, conventional nuclear reactors had a better track record. France decarbonized 80 percent of its electrical system with nuclear. Sweden achieved 50 percent. But conventional nuclear technology, which requires strong central governments and vertically integrated utilities that build, own, and operate plants, has been swimming against the current of economic liberalization and declining faith in technocratic institutions for decades. Outside of China and a few other Asian economies, few nations have been able to build large nuclear plants cost-effectively in recent decades.

Such limitations continue to plague power sector decarbonization efforts around the world. But the power sector accounts for only about 20 percent of global primary energy use and turns out to be relatively easy to decarbonize compared with transportation, agriculture, industry, and construction. There are currently few viable substitutes for fossil fuels in the production of steel, cement, or fertilizer or for powering aviation and heavy transportation.

Longer term, there may be better options, including advanced nuclear reactors that can provide heat for industrial processes, carbon capture technologies that can capture emissions from burning fossil fuels, and low-carbon synthetic fuels that might substitute for diesel and aviation fuels. But all are decades away from viable application. The technologies that are needed to cut emissions deeply enough to stabilize temperatures at two degrees, in short, will not be ready in time to do so. As a result, continued devotion to the two-degree threshold has ended up undermining both important incremental pathways to lower emissions and long-term investment in the development and commercialization of technologies that would be necessary to deeply decarbonize the global economy.

POINT OF NO RETURN

Almost 30 years after the UN established the two-degree threshold, over 80 percent of the world’s energy still comes from fossil fuels, a share that has remained largely unchanged since the early 1990s. Global emissions and atmospheric concentrations of carbon dioxide continue to rise. Climate policy, at both international and national levels, has had little impact on their trajectory.

Climate advocates have persistently blamed the failures of climate policy on the corrupting political power of the fossil fuel industry. Industry-funded “merchants of doubt,” as the historians Naomi Oreskes and Erik Conway originally dubbed them, together with heavy political spending, have stopped climate mitigation efforts in their tracks. But those claims are U.S.-centric. Climate skepticism and denial have not found anywhere close to the same level of political traction outside the United States. Exxon and the Koch brothers have no political franchise in the German Bundestag, the Chinese Central Committee, or most other places outside Washington. And yet those nations have had no more success cutting emissions than has the United States. To the contrary, U.S. emissions have fallen faster than those of almost any other major economy over the last decade.

The alternate explanation is rather less dramatic. Decarbonization is hard. Fossil fuels continue to bring substantial benefit to most people around the world, despite the significant environmental consequences. The alternatives have improved, but not sufficiently to displace fossil energy at scales that would be consistent with stabilizing temperatures at the two-degree threshold. The consequences of failing to do so for human societies are too uncertain or too far off in the future to motivate either a World War II-style mobilization to deploy renewable energy or a global price on carbon high enough to rapidly cut emissions.

At some point over the next 20 years or so, atmospheric concentrations of carbon will almost certainly surpass 450 parts per million, the concentration commonly used as a proxy for avoiding long-term temperature increases of greater than two degrees. At that point, the only certain path to stay under the target will be either to pull carbon out of the atmosphere at almost unimaginable scales or to alter the chemistry of the atmosphere such that rising greenhouse gas concentrations do not lead to higher temperatures. Functionally, that moment has already arrived. Virtually all scenarios consistent with stabilizing global temperatures at plus two degrees, according to the Intergovernmental Panel on Climate Change, explicitly require so-called negative emissions in the latter half of this century.

In recent years, the moral hazard argument used against adaptation has also been used against geoengineering and carbon removal technologies. The suggestion that it might be possible to pull sufficient carbon out of the atmosphere to lower global temperatures or, short of that, change the chemical composition of the atmosphere or the oceans such that large temperature increases might be forestalled, the logic goes, risks distracting us from the central task of rapidly decarbonizing the global economy. Yet no one is seriously proposing embarking on large-scale carbon removal or geoengineering today. We haven’t really figured out how to do the former, and the latter brings a range of potential risks that we don’t yet fully understand. Still, such emergency measures may be necessary in the future even with a steep cut in emissions. As in the case of adaptation, however, the twin fictions that the two-degree limit remains a plausible goal and that dangerous climate change can be avoided should we achieve it allow the moral hazard argument to be marshaled against even sensible calls for serious public research.

A PRACTICAL PATH FORWARD

At this point, if there is a moral hazard argument to be made, it is against the two-degree threshold, not for it. Humans are going to live on a significantly hotter planet for many centuries. The notion that two degrees remains an achievable target risks diverting attention from steps we might take today to better weather the changes that are coming. Once the world lets go of the unrealistic two-degree target, a range of practical policies comes much more clearly into focus.

We should do all that we can to speed up decarbonization. Accelerating the coal-to-gas transition and continuing the deployment of today’s renewable energy technologies would incrementally reduce climate risk even if neither is capable of decarbonizing economies at rates consistent with achieving the two-degree target. At the same time, it is important to support those efforts in ways that don’t lock out technologies that will be necessary to achieve deeper emissions cuts over the longer term. Continuing subsidies for low-efficiency solar panels, for instance, have shut higher-efficiency solar technologies out of the renewables market. Cheap gas has rendered many nuclear power plants, which don’t get the same privileged access to electrical grids or direct production subsidies as do wind and solar energy, uneconomical. At relatively low overall shares of electricity generation, variable sources of power such as wind and solar risk crowding out other zero-carbon options that will be necessary to fully decarbonize power grids. And if deep decarbonization is the objective, much greater public investment will be needed to develop and commercialize clean energy technologies, even though those technologies are unlikely to contribute much to emissions-cutting efforts over the next several decades.

Meanwhile, we need to stop trying to balance the increasingly parsimonious carbon emissions budgets entailed by a two-degree target on the backs of the global poor. There is no moral justification for denying those populations the benefits of fossil-fuel-driven development. The lower emissions associated with curtailed development will not provide any meaningful amelioration of climate extremes for many decades to come, whereas the benefits that come with development will make those populations substantially more resilient to climate extremes right now.

Finally, the world must get serious about researching carbon removal and geoengineering and developing the international institutions and governance frameworks necessary to use them, not out of the certainty that we will eventually need them but through an abundance of caution that we might.

From its earliest days, climate policy and advocacy have always been predicated, sometimes explicitly and always implicitly, on the idea that climate change was a problem that could be solved. The two-degree threshold is a reflection of that impulse. In reality, climate change is now a permanent condition of the human present and future, one that we will manage more or less successfully but that we will never solve. Liberating international climate policy efforts from the various constraints that the two-degree threshold imposes can’t eliminate all of the risks that climate change will bring. But doing so might allow us to manage them better.

This article was originally published on ForeignAffairs.com.

Estrogen therapy may reduce risk of long-term health problems associated with ovary removal

Posted: 08 Feb 2018 05:34 AM PST

DEAR MAYO CLINIC: I have been using an estrogen hormone patch for two years since having a hysterectomy at 38. I had my ovaries removed as part of the procedure. How often should I have my estrogen levels tested, and how long will I need to continue hormone replacement?

ANSWER: For a woman in your situation, estrogen replacement therapy typically is recommended (assuming there is no medical reason not to use estrogen) until the average age of natural menopause — usually around 51. This is done mainly to reduce the risk of long-term health problems associated with removal of the ovaries. To ensure you’re receiving the right dose, it’s a good idea to have your estrogen level checked at least once a year, and eight to 12 weeks after any dose changes.

A hysterectomy is surgical removal of the uterus. As in your case, the procedure often is combined with removal of the ovaries — a surgery known as an oophorectomy. If the surgery involves removing both ovaries, it’s called a bilateral oophorectomy. When only one ovary is removed, it’s a unilateral oophorectomy. Because the ovaries make the main hormones responsible for a woman’s menstrual cycle, removing your ovaries results in menopause.

When both ovaries are removed before a woman goes through menopause naturally, there is an increase in the risk of a number of serious long-term health problems. They include heart disease, cognitive dysfunction and dementia, mood disorders, bone thinning and early death. The younger a woman is when she has bilateral oophorectomy, the higher the risk.

Because of these risks, bilateral oophorectomy is less common now than it was in the past. In some cases, however, the procedure may not be avoidable, particularly for women who require the surgery due to gynecologic cancer or are at high risk for developing ovarian cancer, such as women who have a BRCA gene mutation.

Estrogen replacement therapy can provide some protection against the health risks that result from bilateral oophorectomy. It also can ease menopause symptoms, such as hot flashes, night sweats and vaginal dryness. The current practice is to use estrogen-based hormone therapy at least until the natural age of menopause, unless there is a medical reason a woman shouldn’t receive it. For example, in women who have had breast cancer, estrogen replacement may not be appropriate.

In general, premenopausal women who have a bilateral oophorectomy are prescribed a dose of estrogen about two to three times higher than the dose that is used to control menopause symptoms in women going through natural menopause. This dose usually results in estrogen levels comparable to those found in a woman prior to menopause.

As in your case, a common way to receive estrogen replacement is through a patch that’s placed on the skin. This is called an estradiol patch. Estrogen replacement also can be taken in pill form. Using an estradiol patch that delivers 100 micrograms per day of the medication, or oral estradiol of 2 milligrams per day, typically results in an average estradiol level of 100 picograms per milliliter.

It is not standard practice to check estradiol levels in women on replacement therapy. Instead, the dose of estrogen replacement therapy typically is adjusted, as needed, to control menopausal symptoms effectively. That said, for young women like yourself, it is a good practice to have your estradiol levels checked annually and after dose changes to ensure that the level is around the desired goal of 100 picograms per milliliter. — Ekta Kapoor, M.B.B.S., Women’s Health Clinic, Mayo Clinic, Rochester, Minn.

(Mayo Clinic Q & A is an educational resource and doesn’t replace regular medical care. E-mail a question to MayoClinicQ&A@mayo.edu. For more information, visit www.mayoclinic.org.)

Calcium and exercise both important for bone health

Posted: 08 Feb 2018 05:33 AM PST

DEAR MAYO CLINIC: My doctor says that exercise is even better than calcium supplements for helping maintain bone density and prevent fractures. Can you explain why?

ANSWER: Both calcium and physical activity are important for bone health. But when you consider the net benefits of calcium, especially in supplement form, it’s unlikely to serve as a good substitute for regular exercise.

Calcium is an important mineral that your body uses to build and maintain strong bones. Foods that are high in calcium include dairy products, dark green leafy vegetables and certain fish, such as sardines. Various foods and beverages, such as cereals and fruit juices, may be fortified with calcium and vitamin D, as vitamin D enhances absorption of calcium.

Calcium in supplement form may help people who can’t get enough calcium from their diet, or those who absorb calcium poorly because of untreated celiac disease or prior bariatric surgery.

However, recent evidence suggests that increasing calcium intake through supplements has a modest and limited effect on bone density. Calcium supplements also can have certain side effects. They can cause constipation, interfere with other drugs and, at higher doses, may be linked to the development of kidney stones. Studies suggest a potential link between excessive amounts of calcium and conditions such as heart disease and prostate cancer.

On the other hand, regular exercise that uses a variety of muscle groups and includes some strength training helps you build a protective framework around your skeleton. It also helps you move more easily and improves your balance. Exercise helps decrease your risk of falling and breaking a bone, which is the ultimate concern.

Ingesting the recommended daily amounts of calcium primarily through dietary sources and staying physically active appear to be the best approaches to limit your fracture risk. — Matthew T. Drake, M.D., Ph.D., Hematology, Mayo Clinic, Rochester, Minn.

Readers: Recent evidence suggests that what you eat can influence your risk of various diseases. In a JAMA study published in March 2017, scientists determined the top foods likely to lead to death from cardio-metabolic health conditions, such as heart disease, stroke and Type 2 diabetes.

Based on data representative of the U.S. population, researchers concluded that some foods and dietary factors detrimental to health are consumed in excess. These include sodium, processed meats, unprocessed red meats and sugar-sweetened beverages. Meanwhile, foods beneficial to health aren’t consumed enough, researchers said. Such foods include nuts; seeds; fish; vegetables; fruits; whole grains; and polyunsaturated fats, such as olive oil.

Mayo Clinic health care providers advise eating a diet focused on vegetables; fruits; whole grains; nuts; seeds; low-fat dairy products; healthy fats, such as olive or vegetable oils; and lean proteins, including fish and beans. Approximately three-fourths of the sodium people consume comes from processed food.

If you think you might be getting too much sodium in your diet, consider eating mostly unprocessed whole foods and adding flavor with spices rather than salt. — Adapted from Mayo Clinic Health Letter

(Mayo Clinic Q & A is an educational resource and doesn’t replace regular medical care. E-mail a question to MayoClinicQ&A@mayo.edu. For more information, visit www.mayoclinic.org.)

Berlin: Russell Brand to Star as Hitman in ‘Butterfingers’

Posted: 08 Feb 2018 05:03 AM PST

Russell Brand will play a lonely, down-on-his-luck hitman in the indie comedy caper “Butterfingers.” The film will follow Keith (Brand), also known in his world as Butterfingers, who becomes involved in a race to complete a hit ahead of his arch-rival, towing two kids he has kidnapped along the way.

Production gets underway in the U.K. in July. Barnaby Southcombe (“I, Anna”) directs from a script by Tom Nash. The Fyzz Facility is financing. Mark Lane, James Harris, Wayne Marc Godfrey and Robert Jones of The Fyzz Facility will produce alongside Embargo Films.

“Tom has written a brilliantly funny script that finds a new fan with every read. We could not be more excited to be working with Russell and Barnaby to bring ‘Butterfingers’ alive,” said Lane.

Highland Film Group will handle global sales and introduce the film to buyers at the Berlin Film Festival.

Comedian and actor Brand was last seen in Larry Charles’ “Army of One.” His other movie credits include “Forgetting Sarah Marshall” and “Get Him to the Greek.” He is represented by WME and Hannah Chambers Management.

Viacom 1Q Profit Boosted by Tax Cut, While Revenue Falls At Film, TV Operations

Posted: 08 Feb 2018 04:19 AM PST

Viacom said net income in its first fiscal quarter increased despite revenue shortfalls at both its film and TV operations, as the results of cost controls and the recent federal tax cut boosted its bottom line.

The New York owner of MTV, Nickelodeon and the Paramount movie studio said revenue fell 7.6% to $3.07 billion, compared with $3.32 billion in the year-earlier period. The chief factor in the drop appears to be a reduction in the fees the company collects from U.S. cable and satellite distributors.

Viacom during the quarter notched “improvements in our business and positioning the company for the future,” said Bob Bakish, Viacom’s CEO, in a prepared statement.

Viacom reported earnings from continuing operations of $535 million, or $1.33 a share, compared with $396 million, or $1 a share, in the year-earlier period.

But the company faced headwinds in its two chief lines of business, cable television and filmed entertainment.

Viacom’s cable-TV networks saw operating income fall 7%, to $913 million. Ad revenue rose 1% to $1.31 billion, but revenue from affiliate fees dipped 4%, to $1.09 billion. U.S. ad revenue fell 5%, owing to lower linear viewership. The company recently launched the Paramount Network, a cable-TV operation designed to feature high-end series and movies. MTV, Viacom’s flagship network, saw ratings increase.

Viacom’s filmed-entertainment operations saw operating losses narrow to $130 million from $180 million. Overall revenue fell 28% to $544 million. Viacom has been working to turn around its Paramount operations under a new management team.

More to come….

 

529 Savings Plans Have More Uses — But States Need to Catch Up

Posted: 08 Feb 2018 04:17 AM PST

Question: What are the new rules for using money in a 529 college-savings plan for kindergarten through 12th-grade expenses, rather than just college?

Answer: The new tax law expanded the definition of eligible expenses for 529s. You can now withdraw up to $10,000 from a 529 each year tax-free to pay tuition for kindergarten through 12th grade. You can still use 529 money tax-free for college expenses, too, with no annual limit. Eligible college expenses include tuition and fees, room and board (even an off-campus apartment for a student who goes to school at least half-time), books and a computer. For more information about eligible expenses, see the “Qualified Tuition Program” section of IRS Publication 970, Tax Benefits for Education.
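The federal limits described above can be sketched as a small helper. This is an illustrative sketch only, not tax advice: the function name and structure are assumptions for the example, and state-level rules may differ.

```python
# Illustrative sketch of the federal tax-free 529 withdrawal limits:
# K-12 tuition is capped at $10,000 per year, while qualified college
# expenses have no annual federal cap. Names here are hypothetical.

K12_ANNUAL_CAP = 10_000

def tax_free_withdrawal(requested: float, purpose: str,
                        k12_withdrawn_this_year: float = 0.0) -> float:
    """Return the portion of `requested` that stays tax-free federally."""
    if purpose == "college":
        # No annual federal cap on qualified college expenses.
        return requested
    if purpose == "k12":
        # Only the unused part of the $10,000 annual cap qualifies.
        remaining_cap = max(0.0, K12_ANNUAL_CAP - k12_withdrawn_this_year)
        return min(requested, remaining_cap)
    # Non-qualified purposes get no tax-free treatment.
    return 0.0
```

For example, requesting $12,000 for K-12 tuition in one year would leave only $10,000 tax-free under the federal rules, while a $25,000 college withdrawal for qualified expenses would face no annual cap.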

SEE ALSO: How Well Do You Know 529 Plans?

The new tax law left some unknowns. Some states need to change their laws to coordinate with the new federal law. Otherwise, they could end up charging state income taxes and a 10% penalty for withdrawals that aren’t used for college, says Susie Bauer, senior vice president and 529 manager at Baird Private Wealth Management. Also, if you received a tax deduction for your contribution, you may have to repay it if you use the money for non-qualified expenses and your state doesn’t change its rules. Contact your plan before taking money out of the 529 for K-12 tuition to make sure the withdrawal is a qualified expense in your state. If your state’s rules aren’t clear yet, you may want to wait a few months before taking a precollege withdrawal. More states should be clarifying their laws in the next few months.

You may also want to adjust your investments. “If you’re going to take advantage of the K-12 tuition change, you need to take into account your time horizon,” says Roger Young, a senior financial planner at T. Rowe Price. Depending on your child’s age, you may be taking those withdrawals much earlier than you had originally intended, particularly if you invested in an age-based fund whose mix of stocks and bonds is tied to the year your child will start college. In that case, you may want to shift some money you plan to withdraw for K-12 expenses in the next few years to more conservative investments, so you won’t have to worry about market volatility when the tuition bill is due. And because you can change your 529 investments only twice per year, consider keeping the current money invested where it is but adding new contributions to conservative investments for short-term expenses, says Young.

Even though you can now use some money from your 529 for K-12 tuition, think carefully before taking that withdrawal. “With the shorter time horizon, you’re probably going to be more conservative and will probably have less gain potential,” says Young. “For people who aren’t in a high tax bracket, you have to ask: ‘Does this really make sense to me?’ ” The longer you keep the money growing in the account, the more time you’ll have to benefit from the tax-free gains for eligible expenses.

SEE ALSO: What the New Tax Law Means for Students

Copyright 2018 The Kiplinger Washington Editors

All contents copyright 2018 The Kiplinger Washington Editors, Inc. Distributed by Tribune Content Agency, LLC

Twitter Posts First-Ever Profit on Strong Q4 Results as User Growth Stalls

Posted: 08 Feb 2018 04:08 AM PST

Twitter delivered the first profitable quarter in its nearly 12-year history — although it failed to increase its total monthly user base in the period.

The social network’s financial results handily beat Wall Street expectations. Twitter posted fourth-quarter revenue of $732 million, up 2% year-over-year, reversing the trend of declining top-line numbers over the past few quarters.

In Q4, quarterly GAAP (generally accepted accounting principles) net income was $91 million, versus a net loss of $167 million in the year-earlier period. Adjusted earnings per share were 19 cents. Wall Street expected revenue of $687 million and EPS of 14 cents.

Twitter averaged 330 million monthly active users in Q4, unchanged from the previous quarter. In part, the company said that was because it stepped up efforts to shut down “spam, malicious automation, and fake accounts.” Analysts had anticipated 2 million MAU net adds in the period. In Q4 2016, Twitter gained an average 1 million monthly users.

“Q4 was a strong finish to the year,” Twitter CEO Jack Dorsey said in announcing the results. “We returned to revenue growth, achieved our goal of GAAP profitability, increased our shipping cadence, and reached five consecutive quarters of double digit DAU [daily active user] growth. I’m proud of the steady progress we made in 2017, and confident in our path ahead.”

Twitter shares were up as much as 14% in premarket trading — hitting a two-year high. The company had previously said it “will likely” turn a profit in Q4 on a GAAP basis.

Although Twitter’s MAU total of 330 million for the quarter was flat quarter-over-quarter, it was up 4% year-over-year. In addition, the company said average daily active users grew 12% year-over-year.

According to Twitter, a change Apple made to the Safari browser’s third-party app integration resulted in a loss of approximately 2 million MAUs in Q4 (1 million in the U.S. and 1 million internationally). It also cited “seasonality and increased information-quality efforts” as affecting MAU growth. Monthly users in the U.S. averaged 68 million for the period, up 2% year-over-year and a decrease of 1 million quarter-over-quarter, reflecting the impact of the change in Safari.

On the revenue front, Twitter’s ad revenue on owned-and-operated platforms for Q4 was $593 million — up 7% year-over-year and a whopping 30% increase from the previous quarter. Twitter cited user-engagement growth and better sales execution as contributing to the lift. Video remains Twitter’s largest ad format and grew as a percent of total revenue in Q4.

Meanwhile, Twitter has continued to suffer from high-level executive exits. Last month, COO Anthony Noto left the company to join lending startup SoFi as chief executive.

Amitabh Bachchan, Rishi Kapoor Star in ‘102 Not Out’ for Sony (EXCLUSIVE)

Posted: 08 Feb 2018 04:00 AM PST

India’s top veteran actors, Amitabh Bachchan and Rishi Kapoor, will star in “102 Not Out,” a comedy about living life to the full. Sony Pictures International Productions India has boarded the film as co-producer and worldwide distributor.

Umesh Shukla (“OMG Oh My God!”) directs and co-produces through his Benchmark Pictures, alongside Treetop Entertainment and Sony.

The film is based on a Gujarati stage play by Saumya Joshi, about a father and son love story. Joshi wrote the adapted screenplay. “The idea of adapting (the play) into a Hindi feature came from the fact that the relationship between a parent and a child is universal and the one where love supersedes everything else, including age! Amitabh Sir and Rishi Sir together are a dream cast for anyone,” said Shukla.

Not for the first time in his career, superstar Bachchan has to use prosthetics and makeup to significantly change his age; in the 2009 comedy “Paa,” Bachchan aged backwards. In “102 Not Out” he plays a 102-year-old man, father to a 75-year-old (Kapoor). The pair last performed together some 27 years ago in “Ajooba.”

The film is set for release on May 4, 2018, and Sony is beginning its marketing campaign by attaching a trailer to “Pad Man,” its high-profile biopic which releases in India and international territories from Friday.

Sony is one of the leading Hollywood players in India, and “102 Not Out” is the latest in an expanding feature production strategy by Sony in India. Sony Pictures International Productions India co-produced “Pad Man,” while Sony Pictures Networks Productions is backing sports drama “Soorma,” about the turbulent life of field hockey star Sandeep Singh.

The group has multiple TV operations in the country, including leading pay-TV channel Sony Entertainment Television, SAB TV and VoD service Sony Liv. It is expected this year to expand into regional broadcasting with the launch of a Marathi-language general entertainment channel. A recent report by consultancy Media Partners Asia said that India accounts for more than 60% of channel revenues for the Hollywood entertainment conglomerates, with Sony among the three market leaders.

“I know the country loves the younger, fresh and vibrant new generation – as it should deservedly , but who ever said the ‘oldies’ are not capable to do similar .. we know you may not love us, but given a chance we shall not disappoint you either .. and that is as immodest as I can get,” said Bachchan in a prepared statement.

“Being in ‘102 Not Out’ brought back a lot of fond memories and fun I have shared with Amitji during my younger days. I am glad to reunite with him on such a special and unusual film,” said Kapoor.